# A Generalized Framework of Subspace Tuning for PEFT

📘 Introduction | 💥 News | 🛠️ Usage | 🎯 Tasks | 🔍 Algorithms | 🤔 Reporting Issues | 📧 Contact Us

## 📘 Introduction

Welcome to our repository, which contains a diverse collection of Subspace Tuning methods for Parameter-Efficient Fine-Tuning (PEFT). Subspace tuning methods are essential for adapting large pre-trained models to specific tasks with minimal changes to the original parameters. Subspace tuning endeavors to identify the maximal projection of the optimal weight $\mathbf{W}^{*}$ onto the subspace spanned by the bases of $\phi(\mathbf{W})$, where $\phi(\mathbf{W})$ denotes the subspace transformation of the original frozen weight $\mathbf{W}$. For more details, please refer to [the original paper](https://arxiv.org/abs/2407.05417).
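
Schematically, this amounts to an orthogonal projection: searching the subspace spanned by the bases of $\phi(\mathbf{W})$ for the weight closest to $\mathbf{W}^{*}$. The following is a restatement of that idea, not the paper's exact formulation:

```latex
% Schematic subspace-tuning objective (a paraphrase of the idea above,
% not the paper's exact notation): the best approximation of W*
% reachable inside the subspace spanned by the bases of \phi(W) is its
% projection, i.e., the minimizer of the Frobenius distance over that
% subspace.
\hat{\mathbf{W}}
  = \operatorname*{arg\,min}_{\mathbf{W}' \in \operatorname{span}\phi(\mathbf{W})}
    \bigl\lVert \mathbf{W}^{*} - \mathbf{W}' \bigr\rVert_{F}
```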

![Framework](./resources/framework.png)

We aim to provide a comprehensive resource for researchers and practitioners in this field, and facilitate easy integration into your projects. Whether you are here to find resources for your projects or to contribute, we hope this repository will be a valuable and inspiring part of your research journey.

### Information Box

This repository also contains some of the other projects we have worked on, which might have led you here.

- [**LoRA-Dash**](https://chongjiesi.site/full-publications/2024-arxiv-lora-dash/): Unleashing the Power of Task-Specific Directions in Parameter Efficient Fine-tuning.
- [**FLoRA**](https://chongjiesi.site/full-publications/2024-arxiv-flora/): FLoRA: Low-Rank Core Space for N-dimension.

## 💥 News

- **[2024.09.04]** 🔥🔥 Add ***Method*** LoRA-Dash and ***Task*** Subject-driven Generation to Our Repo!
- **[2024.08.18]** 🔥🔥 Add ***Task*** Math Reasoning to Our Repo!
- **[2024.07.22]** 🔥🔥 Add ***Methods*** PISSA, MiLoRA and Spectral Adapter to Our Repo!
- **[2024.07.09]** 🔥🔥 Repository Constructed!

## πŸ“ Todo List

- Nothing to do yet.

## 🛠️ Usage

To use the algorithms in this repository, clone the repository and install the necessary dependencies.

1. Clone this Repository:

```bash
git clone https://github.com/Chongjie-Si/Subspace-Tuning.git
cd Subspace-Tuning
```

2. Follow the Instructions in Each Folder.

## 🎯 Tasks

We support several tasks including:

- Natural Language Understanding ([NLU](./NLU/))
- Natural Language Generation ([NLG](./NLG_QA/))
- Question Answering ([QA](./NLG_QA/))
- Commonsense Reasoning ([CR](./CR_MR/))
- Math Reasoning ([MR](./CR_MR/))
- Subject-driven Generation ([SdG](./SdG/))
- ...

## πŸ” Algorithms

Based on subspace tuning theory, PEFT methods are classified into three categories: reconstruction-based, extension-based and combination-based.

![Method](./resources/method.png)

We implement different methods mainly in [loralib/](./loralib/loralib/).
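
As a concrete illustration of the extension-based category, here is a minimal LoRA-style layer in plain PyTorch. This is a self-contained sketch of the idea (a frozen weight plus a trainable low-rank update), not the loralib implementation used in this repo:

```python
import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    """Frozen base weight W plus a trainable low-rank update B @ A."""

    def __init__(self, in_features: int, out_features: int, r: int = 8, alpha: int = 16):
        super().__init__()
        # Pre-trained weight, kept frozen (randomly initialized for this sketch).
        self.weight = nn.Parameter(
            torch.randn(out_features, in_features), requires_grad=False
        )
        # Low-rank factors: Delta W = B @ A has rank at most r.
        # A starts small and B at zero, so training begins from the base model.
        self.lora_A = nn.Parameter(torch.randn(r, in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(out_features, r))
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # y = x W^T + scaling * x (B A)^T; only lora_A and lora_B get gradients.
        base = x @ self.weight.T
        update = (x @ self.lora_A.T) @ self.lora_B.T
        return base + self.scaling * update


layer = LoRALinear(768, 768)
out = layer(torch.randn(4, 768))
print(out.shape)  # torch.Size([4, 768])
```

Because only `lora_A` and `lora_B` receive gradients, the number of trainable parameters scales with the rank `r` rather than with the size of the full weight matrix.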



| Category | Algorithm | Code | Paper |
| --- | --- | --- | --- |
| Reconstruction | SAM-PARSER | Code | 2024 AAAI |
| Reconstruction | IA3 | Code | 2022 NeurIPS |
| Reconstruction | SSB | Code | 2024 arXiv |
| Reconstruction | SSL | Code | 2024 arXiv |
| Reconstruction | BitFit | N/A | 2022 ACL |
| Reconstruction | Prefix-tuning | Code | 2021 ACL |
| Reconstruction | Prompt-tuning | Code | 2021 EMNLP |
| Reconstruction | P-tuning | Code | 2022 ACL |
| Reconstruction | PISSA | Code | 2024 arXiv |
| Reconstruction | MiLoRA | Code | 2024 arXiv |
| Extension | LoRA | Code | 2022 ICLR |
| Extension | AdaLoRA | Code | 2023 ICLR |
| Extension | FLoRA | Code | 2024 arXiv |
| Extension | MoSLoRA | Code | 2024 arXiv |
| Extension | TriLoRA | Code | 2024 arXiv |
| Extension | Adapter (Houlsby) | N/A | 2019 ICML |
| Extension | Adapter (Pfeiffer) | N/A | 2021 ACL |
| Extension | Parallel Adapter | Code | 2022 ICLR |
| Combination | DoRA | Code | 2024 ICML |
| Combination | SVDiff | Code | 2023 ICCV |
| Combination | Spectral Adapter | Code | 2024 arXiv |
| Combination | LoRA-Dash | Code | 2024 arXiv |

More algorithms and updates are continually added.

We have also tested the performance of some algorithms on NLU and [CR](./Fair_Comparison/) tasks.

![result](./resources/result.png)

## 🎁 Contribution

We welcome contributions to this repository! Whether you’re fixing bugs, adding new features, or improving documentation, your help is appreciated. Please follow the [guidelines](./resources/Contributions.md) to ensure a smooth contribution process.

## 💡 Further Information

Thank you for your interest in our PEFT code repository. We strive to make this a valuable resource for your projects and research endeavors.

Our goal is to foster a collaborative environment where both you and our researchers can exchange ideas and cooperate. Beyond discussing code-related issues, we encourage you to share your perspectives on any PEFT methodology and address any potential challenges you encounter. We welcome discussions that may spark new insights and innovations.

In addition, this repository is somewhat personal in nature: it contains the tasks and algorithms I use in my own experiments.
If there are algorithms you would like to see implemented or additional task scenarios you wish to add, please feel free to [email me](mailto:[email protected]). You can also visit my [personal homepage](https://chongjiesi.github.io) for more details.

## 📧 Contact

If you have any questions, suggestions, or feedback, please feel free to contact us at [[email protected]](mailto:[email protected]).

## 🔗 Citation

If you find this repository useful, please consider giving it a star and citing it in your work:

```bibtex
@article{si2024see,
  title={See Further for Parameter Efficient Fine-tuning by Standing on the Shoulders of Decomposition},
  author={Si, Chongjie and Yang, Xiaokang and Shen, Wei},
  journal={arXiv preprint arXiv:2407.05417},
  year={2024}
}
```

**This repository also contains the code for our other projects. If you find these methods useful, please consider giving a star and citing them in your work.**


### FLoRA: Low-Rank Core Space for N-dimension

```bibtex
@article{si2024flora,
  title={FLoRA: Low-Rank Core Space for N-dimension},
  author={Si, Chongjie* and Wang, Xuehui* and Yang, Xue and Xu, Zhengqin and Li, Qingyun and Dai, Jifeng and Qiao, Yu and Yang, Xiaokang and Shen, Wei},
  journal={arXiv preprint arXiv:2405.14739},
  year={2024}
}
```

### Unleashing the Power of Task-Specific Directions in Parameter Efficient Fine-tuning

```bibtex
@article{si2024unleashing,
  title={Unleashing the Power of Task-Specific Directions in Parameter Efficient Fine-tuning},
  author={Si, Chongjie* and Shi, Zhiyi* and Zhang, Shifan and Yang, Xiaokang and Pfister, Hanspeter and Shen, Wei},
  journal={arXiv preprint arXiv:2409.01035},
  year={2024}
}
```

## 📄 License

This repository is licensed under the [Apache 2.0 license](./LICENSE). See the LICENSE file for more details.