# Neural Prompt Search
S-Lab, Nanyang Technological University

## TL;DR
The idea is simple: we view existing parameter-efficient tuning modules, including [Adapter](https://arxiv.org/abs/1902.00751), [LoRA](https://arxiv.org/abs/2106.09685) and [VPT](https://arxiv.org/abs/2203.12119), as prompt modules and propose to search for the optimal configuration via neural architecture search. Our approach is named **NOAH** (Neural prOmpt seArcH).
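To make the idea concrete, here is a toy sketch of a per-block search space. It is illustrative only: the dimension grids below are hypothetical, not the paper's exact choices.

```python
import itertools

# Each transformer block can host three prompt modules; the search picks a
# size for each, where 0 disables that module. The grids are illustrative.
ADAPTER_DIMS = [0, 8, 16, 32]  # adapter bottleneck width
LORA_RANKS = [0, 4, 8]         # LoRA rank
VPT_LENGTHS = [0, 5, 10]       # number of prompt tokens

def block_choices():
    """All (adapter, lora, vpt) configurations for one block."""
    return list(itertools.product(ADAPTER_DIMS, LORA_RANKS, VPT_LENGTHS))

num_blocks = 12  # ViT-B/16 has 12 transformer blocks
per_block = len(block_choices())
print(f"{per_block} choices per block, {per_block ** num_blocks:.2e} total configurations")
```

Even this toy grid yields an astronomically large space over 12 blocks, which is why a hand-tuned configuration is impractical and a search is used instead.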
## Updates
[05/2022] The [arXiv](https://arxiv.org/abs/2206.04673) paper has been **released**.

## Environment Setup
```
conda create -n NOAH python=3.8
conda activate NOAH
pip install -r requirements.txt
```

## Data Preparation
### 1. Visual Task Adaptation Benchmark (VTAB)
```
cd data/vtab-source
python get_vtab1k.py
```

### 2. Few-Shot and Domain Generalization
- Images

  Please refer to [DATASETS.md](https://github.com/KaiyangZhou/CoOp/blob/main/DATASETS.md) to download the datasets.
- Train/Val/Test splits

  Please refer to the files under `data/XXX/XXX/annotations` for detailed information; a minimal sketch for locating these folders follows this list.
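If you are unsure where the splits ended up after downloading, here is a minimal sketch that lists every `annotations` folder, assuming the datasets were placed under `data/` as described above:

```python
import os

# Walk data/ and print each annotations folder with its split files.
for root, _dirs, files in os.walk("data"):
    if os.path.basename(root) == "annotations":
        print(root, "->", sorted(files))
```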
## Quick Start For NOAH
We use the VTAB experiments as examples.

### 1. Downloading the Pre-trained Model
| Model | Link |
|-------|------|
| ViT-B/16 | [link](https://storage.googleapis.com/vit_models/imagenet21k/ViT-B_16.npz) |

### 2. Supernet Training
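Before launching supernet training with the script below, you can sanity-check the downloaded checkpoint. A minimal sketch, assuming only `numpy` and the default filename `ViT-B_16.npz`:

```python
import numpy as np

# The ViT checkpoint is a plain .npz archive of named weight arrays.
ckpt = np.load("ViT-B_16.npz")
print(f"{len(ckpt.files)} arrays in checkpoint")
for name in sorted(ckpt.files)[:5]:  # peek at a few entries
    print(name, ckpt[name].shape)
```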
```
sh configs/NOAH/VTAB/supernet/slurm_train_vtab.sh PATH-TO-YOUR-PRETRAINED-MODEL
```

### 3. Subnet Search
```
sh configs/NOAH/VTAB/search/slurm_search_vtab.sh PARAMETERS-LIMIT
```
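The `PARAMETERS-LIMIT` argument caps how many extra parameters a candidate subnet may add. As a conceptual illustration only (random sampling with hypothetical cost formulas and the same illustrative grids as in the TL;DR sketch, not the repo's actual search procedure):

```python
import random

D = 768          # ViT-B hidden size
NUM_BLOCKS = 12

def extra_params(adapter_dim, lora_rank, vpt_len):
    # Rough per-block cost: Adapter and LoRA each add two D x dim
    # projections; VPT adds vpt_len prompt tokens of width D.
    return 2 * D * adapter_dim + 2 * D * lora_rank + D * vpt_len

def sample_subnet():
    return [(random.choice([0, 8, 16, 32]),  # adapter width
             random.choice([0, 4, 8]),       # LoRA rank
             random.choice([0, 5, 10]))      # prompt length
            for _ in range(NUM_BLOCKS)]

budget = 500_000  # hypothetical parameter limit
kept = [s for s in (sample_subnet() for _ in range(1000))
        if sum(extra_params(*blk) for blk in s) <= budget]
print(f"{len(kept)} / 1000 sampled subnets fit a {budget:,}-parameter budget")
```

Only subnets that fit the budget are considered during the search; the retraining step then trains the best one from scratch.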
### 4. Subnet Retraining
```
sh configs/NOAH/VTAB/subnet/slurm_retrain_vtab.sh PATH-TO-YOUR-PRETRAINED-MODEL
```
We provide the optimal subnet architecture for each dataset in `experiments/NOAH/subnet/VTAB`.

### 5. Performance
![fig1](figures/table1.jpg)

## Citation
If you use this code in your research, please cite this work.
```
@misc{zhang2022neural,
    title={Neural Prompt Search},
    author={Yuanhan Zhang and Kaiyang Zhou and Ziwei Liu},
    year={2022},
    eprint={2206.04673},
    archivePrefix={arXiv},
    primaryClass={cs.CV}
}
```

## Acknowledgments
Part of the code is borrowed from [CoOp](https://github.com/KaiyangZhou/CoOp), [AutoFormer](https://github.com/microsoft/Cream/tree/main/AutoFormer), [timm](https://github.com/rwightman/pytorch-image-models) and [mmcv](https://github.com/open-mmlab/mmcv). Thanks to [Chong Zhou](https://chongzhou96.github.io/) for the code for downloading VTAB-1k.
[![Hits](https://hits.seeyoufarm.com/api/count/incr/badge.svg?url=https%3A%2F%2Fgithub.com%2FZhangYuanhan-AI%2FNOAH&count_bg=%2379C83D&title_bg=%23555555&icon=&icon_color=%23E7E7E7&title=visitors&edge_flat=false)](https://hits.seeyoufarm.com)