{"id":13645991,"url":"https://github.com/cccntu/minlora","last_synced_at":"2025-04-21T17:31:42.832Z","repository":{"id":130862908,"uuid":"603069918","full_name":"cccntu/minLoRA","owner":"cccntu","description":"minLoRA: a minimal PyTorch library that allows you to apply LoRA to any PyTorch model.","archived":false,"fork":false,"pushed_at":"2023-06-21T23:26:11.000Z","size":16,"stargazers_count":450,"open_issues_count":8,"forks_count":31,"subscribers_count":10,"default_branch":"main","last_synced_at":"2025-03-13T05:33:09.690Z","etag":null,"topics":["fastai","huggingface","pytorch","pytorch-implementation","pytorch-lightning"],"latest_commit_sha":null,"homepage":"","language":"Jupyter Notebook","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/cccntu.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null}},"created_at":"2023-02-17T14:58:07.000Z","updated_at":"2025-03-12T12:28:15.000Z","dependencies_parsed_at":null,"dependency_job_id":"1d497851-f82b-4da2-ade2-31ac2bc7e84a","html_url":"https://github.com/cccntu/minLoRA","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/cccntu%2FminLoRA","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/cccntu%2FminLoRA/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/cccntu%2FminLoRA/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/cccntu%2FminLoRA/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/cccntu","download_url":"https://codelo
ad.github.com/cccntu/minLoRA/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":250100510,"owners_count":21374960,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["fastai","huggingface","pytorch","pytorch-implementation","pytorch-lightning"],"created_at":"2024-08-02T01:02:46.389Z","updated_at":"2025-04-21T17:31:42.560Z","avatar_url":"https://github.com/cccntu.png","language":"Jupyter Notebook","readme":"# minLoRA\n\n\nA minimal, but versatile PyTorch re-implementation of [LoRA](https://github.com/microsoft/LoRA). In only ~100 lines of code, minLoRA supports the following features:\n\n### Features\n\n- Functional, no need to modify the model definition\n- Works everywhere, as long as you use `torch.nn.Module`\n- PyTorch native, uses PyTorch's `torch.nn.utils.parametrize` to do all the heavy lifting\n- Easily extendable, you can add your own LoRA parameterization\n- Supports training, inference, and inference with multiple LoRA models\n\n## Demo\n\n- `demo.ipynb` shows the basic usage of the library\n- `advanced_usage.ipynb` shows how you can add LoRA to other layers such as embedding, and how to tie weights\n\n## Examples\n\n- Finetuning GPT using LoRA + nanoGPT: https://github.com/cccntu/LoRAnanoGPT/pull/1/files\n\n## Library Installation\n\nIf you want to `import minlora` into your project:\n\n```\ngit clone https://github.com/cccntu/minLoRA.git\ncd minLoRA\npip install -e .\n```\n\n## Usage\n\n```python\nimport torch\nfrom minlora import add_lora, apply_to_lora, disable_lora, enable_lora, get_lora_params, 
merge_lora, name_is_lora, remove_lora, load_multiple_lora, select_lora, get_lora_state_dict\n```\n\n### Training a model with minLoRA\n\n```python\nmodel = torch.nn.Linear(in_features=5, out_features=3)\n# Step 1: Add LoRA to the model\nadd_lora(model)\n\n# Step 2: Collect the parameters, pass them to the optimizer\n\nparameters = [\n    {\"params\": list(get_lora_params(model))},\n]\noptimizer = torch.optim.AdamW(parameters, lr=1e-3)\n\n# Step 3: Train the model\n# ...\n\n# Step 4: Export the LoRA parameters\nlora_state_dict = get_lora_state_dict(model)\n```\n\n### Loading and running inference with minLoRA\n\n```python\n# Step 1: Add LoRA to your model\nadd_lora(model)\n\n# Step 2: Load the LoRA parameters\n_ = model.load_state_dict(lora_state_dict, strict=False)\n\n# Step 3: Merge the LoRA parameters into the model\nmerge_lora(model)\n```\n\n### Running inference with multiple LoRA models\n\n```python\n# to avoid re-adding LoRA to the model when rerunning the cell, remove LoRA first\nremove_lora(model)\n# Step 1: Add LoRA to your model\nadd_lora(model)\n\n# Step 2: Load the LoRA parameters\n\n# load three sets of LoRA parameters\nlora_state_dicts = [lora_state_dict_0, lora_state_dict_1, lora_state_dict_2]\n\nload_multiple_lora(model, lora_state_dicts)\n\n# Step 3: Select which LoRA to use at inference time\nY0 = select_lora(model, 0)(x)\nY1 = select_lora(model, 1)(x)\nY2 = select_lora(model, 2)(x)\n```\n\n### References\n\n- [microsoft/LoRA](https://github.com/microsoft/LoRA): the official PyTorch implementation of LoRA\n- [karpathy/minGPT](https://github.com/karpathy/minGPT): the structure of this repo is adapted from minGPT\n\n### TODO\n- [x] A notebook to show how to configure LoRA parameters\n- [x] Real training \u0026 inference
examples\n","funding_links":[],"categories":["Others"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fcccntu%2Fminlora","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fcccntu%2Fminlora","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fcccntu%2Fminlora/lists"}