https://github.com/ZrrSkywalker/LLaMA-Adapter
Fine-tuning LLaMA to follow Instructions within 1 Hour and 1.2M Parameters
- Host: GitHub
- URL: https://github.com/ZrrSkywalker/LLaMA-Adapter
- Owner: ZrrSkywalker
- Created: 2023-06-14T08:42:30.000Z (almost 2 years ago)
- Default Branch: main
- Last Pushed: 2023-06-14T08:45:33.000Z (almost 2 years ago)
- Last Synced: 2025-02-25T06:45:58.516Z (about 2 months ago)
- Size: 2.93 KB
- Stars: 86
- Watchers: 4
- Forks: 6
- Open Issues: 1
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
- awesome_open_llms - LLaMA-Adapter
- awesome-llm - LLaMA-Adapter - Fine-tuning LLaMA to follow Instructions within 1 Hour and 1.2M Parameters. (Materials / GitHub repositories)
- StarryDivineSky - ZrrSkywalker/LLaMA-Adapter
- Awesome-ChatGPT - LLaMA-Adapter 🚀 | Fine-tune LLaMA to follow instructions within 1 hour and 1.2M parameters | (Curated open-source projects / open-source GPT alternative chatbots)
- awesome-foundation-models - code - [paper](https://arxiv.org/pdf/2304.15010.pdf) (Visual Chat Models / Chinese Support)
- awesome-open-gpt - LLaMA-Adapter 🚀 | Fine-tune LLaMA to follow instructions within 1 hour and 1.2M parameters | (Curated open-source projects / open-source GPT alternative chatbots 🔥🔥🔥)
README
# LLaMA-Adapter: Efficient Fine-tuning of LLaMA 🚀
The official codebase has been transferred to [OpenGVLab/LLaMA-Adapter](https://github.com/OpenGVLab/LLaMA-Adapter) for better follow-up maintenance!

## Citation
If you find our LLaMA-Adapter code and paper useful, please cite:
```bibtex
@article{zhang2023llamaadapter,
  title={LLaMA-Adapter: Efficient Fine-tuning of Language Models with Zero-init Attention},
  author={Zhang, Renrui and Han, Jiaming and Zhou, Aojun and Hu, Xiangfei and Yan, Shilin and Lu, Pan and Li, Hongsheng and Gao, Peng and Qiao, Yu},
  journal={arXiv preprint arXiv:2303.16199},
  year={2023}
}
```
```bibtex
@article{gao2023llamaadapterv2,
  title={LLaMA-Adapter V2: Parameter-Efficient Visual Instruction Model},
  author={Gao, Peng and Han, Jiaming and Zhang, Renrui and Lin, Ziyi and Geng, Shijie and Zhou, Aojun and Zhang, Wei and Lu, Pan and He, Conghui and Yue, Xiangyu and Li, Hongsheng and Qiao, Yu},
  journal={arXiv preprint arXiv:2304.15010},
  year={2023}
}
```
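The first paper's title refers to "zero-init attention": learnable adaption prompts whose attention contribution is scaled by a gate initialized to zero, so fine-tuning starts from the unmodified frozen model and only 1.2M adapter parameters are trained. A minimal PyTorch sketch of that gating idea (shapes and names are illustrative, not the official implementation):

```python
import torch
import torch.nn as nn


class ZeroInitGate(nn.Module):
    """Illustrative sketch of zero-init gating: a learnable gate,
    initialized to zero, scales the adapter's attention output so the
    module is an exact no-op at the start of training."""

    def __init__(self, dim: int, prompt_len: int = 10):
        super().__init__()
        # Learnable adaption prompt tokens (illustrative initialization).
        self.prompt = nn.Parameter(torch.randn(prompt_len, dim) * 0.02)
        # Zero-initialized gate: tanh(0) == 0, so hidden passes through unchanged.
        self.gate = nn.Parameter(torch.zeros(1))

    def forward(self, hidden: torch.Tensor, prompt_attn: torch.Tensor) -> torch.Tensor:
        # hidden:      (batch, seq, dim) output of the frozen transformer layer
        # prompt_attn: (batch, seq, dim) attention output over the prompt tokens
        return hidden + torch.tanh(self.gate) * prompt_attn
```

Because the gate starts at zero, early gradient updates cannot destabilize the pretrained weights; the adapter's influence grows only as the gate is learned.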