Official PyTorch Implementation for the "Model Tree Heritage Recovery" paper.
- Host: GitHub
- URL: https://github.com/eliahuhorwitz/mother
- Owner: eliahuhorwitz
- License: other
- Created: 2024-05-28T08:58:26.000Z (7 months ago)
- Default Branch: main
- Last Pushed: 2024-07-07T08:26:00.000Z (6 months ago)
- Last Synced: 2024-10-30T00:52:01.451Z (about 2 months ago)
- Topics: deep-learning, heritage-recovery, huggingface, llama, llama2, machine-learning, model-graph, model-tree, stable-diffusion
- Language: Python
- Homepage: https://vision.huji.ac.il/mother/
- Size: 2.51 MB
- Stars: 55
- Watchers: 4
- Forks: 1
- Open Issues: 0
- Metadata Files:
  - Readme: README.md
# Model Tree Heritage Recovery
Official PyTorch Implementation for the "Model Tree Heritage Recovery" paper.

![](imgs/header.gif)
Our proposed *Model Graphs* and *Model Trees* are new data structures for describing the heredity training relations between models.
In these structures, heredity relations are represented as directed edges.
We introduce the task of *Model Tree Heritage Recovery* (MoTHer Recovery), whose goal is to uncover the
unknown structure of Model Graphs based on the weights of a set of input models.
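As a rough illustration (not the repository's actual code), a Model Tree can be represented as a directed graph whose edges point from a parent model to the models fine-tuned from it; the model names below are hypothetical:

```python
# Illustrative sketch only: a Model Tree as a directed graph, with edges
# pointing from each parent model to the models fine-tuned from it.
import networkx as nx

tree = nx.DiGraph()
tree.add_edge("vit-base", "vit-ft-flowers")  # hypothetical model names
tree.add_edge("vit-base", "vit-ft-food")
tree.add_edge("vit-ft-flowers", "vit-ft-flowers-lora")

# The root of a Model Tree is the unique node with no incoming edges.
root = next(n for n, deg in tree.in_degree() if deg == 0)
print(root)  # vit-base
```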
___

## Installation
1. Clone the repo:
```bash
git clone https://github.com/eliahuhorwitz/MoTHer.git
cd MoTHer
```
2. Create a new environment and install the libraries:
```bash
python3 -m venv mother_venv
source mother_venv/bin/activate
pip install -r requirements.txt
```

## The VTHR Dataset
The ViT Tree Heritage Recovery (VTHR) dataset was created to evaluate the MoTHer Recovery task.
The dataset contains three splits: i) FT - fully fine-tuned models, ii) LoRA-V - ViT models that were fine-tuned with LoRA with varying ranks,
and iii) LoRA-F - ViT models that were fine-tuned with LoRA of rank 16.

Each split contains a Model Graph with 105 models arranged in 3 levels of hierarchy and 5 Model Trees.
All the models for the VTHR dataset are hosted on Hugging Face under the [https://huggingface.co/MoTHer-VTHR](https://huggingface.co/MoTHer-VTHR) organization.
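For reference, a single model from the organization could be fetched with `huggingface_hub` (the repository id below is a hypothetical placeholder, not a real model name):

```python
# Sketch: download one model snapshot from the MoTHer-VTHR organization.
# The repo id is a hypothetical placeholder; browse the org for real names.
from huggingface_hub import snapshot_download

local_path = snapshot_download(repo_id="MoTHer-VTHR/some-model-name")
print(local_path)
```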
To easily process all the models of a Model Graph, we provide a pickle file per split that contains the
original tree structure and the paths for each model. The pickle files are located in the `dataset` directory.

Each of the splits is roughly 30GB in size. There is **no need** to download the dataset in advance; the code will take care of this for you.
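As a minimal sketch of how one of these pickle files might be inspected (the filename and the unpickled object's layout are assumptions; check the `dataset` directory for the actual files):

```python
# Sketch only: load a split's tree structure from its pickle file.
# The filename below is an assumption; see the dataset directory.
import pickle

with open("dataset/ft_split.pkl", "rb") as f:
    tree_info = pickle.load(f)

# Inspect the object before relying on any particular structure.
print(type(tree_info))
```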
## Running MoTHer on the VTHR Dataset
Below are instructions to run MoTHer Recovery on the different splits.
We start by assuming the models are already clustered into the different Model Trees.
We will later discuss how to perform this clustering.

### Running on Model Graphs with known model clusters
#### Running on the FT Split
As a first step, we need to gather the weight statistics by running the following command:
```bash
python get_vit_layer_statistics.py
```

Once the statistics are gathered, we can run MoTHer Recovery on the FT split:
```bash
python MoTHer_FullFT.py
```

#### Running on the LoRA Splits
For the LoRA-V and LoRA-F splits there is no need to gather the weight statistics; we can run MoTHer Recovery directly:
```bash
python MoTHer_LoRA.py
```

### Running on Model Graphs with multiple Model Trees
When running on models from different Model Trees (i.e., a Model Graph), the models must first be clustered into their respective Model Trees.
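For intuition only (the repository's actual clustering lives in `clustering.py`), models from the same Model Tree tend to be close in weight space, so a simple distance-based clustering over flattened weights can separate the trees:

```python
# Sketch, not the repo's implementation: cluster models into candidate
# Model Trees via pairwise distances between their flattened weights.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

# Hypothetical input: one flattened weight vector per model (toy sizes).
weights = np.random.randn(6, 1024)

# Models fine-tuned from the same base stay relatively close in weight
# space, so agglomerative clustering on Euclidean distances groups them.
Z = linkage(weights, method="average", metric="euclidean")
labels = fcluster(Z, t=2, criterion="maxclust")  # request 2 clusters
print(labels)
```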
We provide a script that shows the clustering accuracy for both the LoRA-V and the FT splits.
You can change the `LORA` flag to switch between the two splits.

```bash
python clustering.py
```

## Running MoTHer on in-the-wild Models
To run MoTHer on Llama2 or Stable Diffusion, run the tests in the `test_llama_and_sd.py` file: the `test_llama()` test for Llama2 and the `test_SD()` test for Stable Diffusion, e.g., as shown below.
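Assuming these are standard pytest tests (an assumption; check the file if another runner is used), they can be invoked individually:

```bash
# Assumes pytest-style tests; adjust if the file uses a different runner.
pytest test_llama_and_sd.py::test_llama   # MoTHer on Llama2
pytest test_llama_and_sd.py::test_SD      # MoTHer on Stable Diffusion
```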
## Citation
If you find this useful for your research, please use the following.

```
@article{horwitz2024origin,
  title={On the Origin of Llamas: Model Tree Heritage Recovery},
  author={Horwitz, Eliahu and Shul, Asaf and Hoshen, Yedid},
  journal={arXiv preprint arXiv:2405.18432},
  year={2024}
}
```

## Acknowledgments
- The project makes extensive use of several Hugging Face libraries (e.g., [Diffusers](https://huggingface.co/docs/diffusers/en/index), [PEFT](https://huggingface.co/docs/peft/en/index), [Transformers](https://huggingface.co/docs/transformers/en/index)).