https://github.com/KerryDRX/ESCORT
Official implementation of Prompt-Based Exemplar Super-Compression and Regeneration for Class-Incremental Learning.
- Host: GitHub
- URL: https://github.com/KerryDRX/ESCORT
- Owner: KerryDRX
- Created: 2023-11-25T17:23:38.000Z (12 months ago)
- Default Branch: main
- Last Pushed: 2023-11-25T17:32:09.000Z (12 months ago)
- Last Synced: 2024-08-01T18:33:36.911Z (3 months ago)
- Language: Python
- Size: 2.01 MB
- Stars: 6
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
- awesome-diffusion-categorized
README
# ESCORT: Prompt-Based Exemplar Super-Compression and Regeneration for Class-Incremental Learning
## Overview
_**E**xemplar **S**uper-**CO**mpression and **R**egeneration based on promp**T**s_ (ESCORT) is a diffusion-based class-incremental learning (CIL) approach that boosts CIL performance by storing exemplars with increased quantity and enhanced diversity under a limited memory budget. ESCORT works by:
- extracting visual and textual prompts from selected images and saving prompts instead of images
- regenerating exemplars from the saved prompts with ControlNet for CIL model training in subsequent phases
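Below is a minimal, illustrative sketch of this idea (not the repository code): the visual prompt is a Canny edge map and the textual prompt is derived from the class tag. The function name and thresholds are assumptions for illustration only.

```python
# Illustrative sketch of the ESCORT idea, not the repository implementation:
# store compact prompts instead of raw exemplars, then regenerate images later.
import cv2  # OpenCV, assumed available in the environment


def extract_prompts(image_path, class_name, low_threshold=100, high_threshold=200):
    """Compress an exemplar into a visual prompt (Canny edge map) and a
    textual prompt (built from the class tag). Thresholds are illustrative."""
    image = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    edge_map = cv2.Canny(image, low_threshold, high_threshold)  # visual prompt, far smaller than the RGB image
    text_prompt = f"a photo of a {class_name}"                  # textual prompt
    return edge_map, text_prompt


# In later CIL phases, each stored (edge_map, text_prompt) pair is fed to a
# Canny-conditioned ControlNet to regenerate multiple diverse exemplar copies.
```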
## Paper
Official implementation of _Prompt-Based Exemplar Super-Compression and Regeneration for Class-Incremental Learning_.
[Ruxiao Duan](https://scholar.google.com/citations?hl=en&user=aG-fi1cAAAAJ)<sup>1</sup>,
[Yaoyao Liu](https://scholar.google.com/citations?hl=en&user=Uf9GqRsAAAAJ)<sup>1</sup>,
[Jieneng Chen](https://scholar.google.com/citations?hl=en&user=yLYj88sAAAAJ)<sup>1</sup>,
[Adam Kortylewski](https://scholar.google.com/citations?hl=en&user=tRLUOBIAAAAJ)<sup>2,3</sup>,
[Alan Yuille](https://scholar.google.com/citations?hl=en&user=FJ-huxgAAAAJ)<sup>1</sup>

<sup>1</sup>Johns Hopkins University,
<sup>2</sup>University of Freiburg,
<sup>3</sup>Max Planck Institute for Informatics

## Getting Started
### 1. Environment Creation
```
git clone https://github.com/lllyasviel/ControlNet.git
conda env create -f ControlNet/environment.yaml
conda activate control
```
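As an optional sanity check (assuming the `control` environment provides PyTorch and OpenCV via ControlNet's `environment.yaml`), verify the installation:

```python
# Optional sanity check, run inside the "control" environment; assumes the
# environment provides PyTorch and OpenCV as specified in environment.yaml.
import torch
import cv2

print("PyTorch:", torch.__version__, "| CUDA available:", torch.cuda.is_available())
print("OpenCV:", cv2.__version__)
```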
### 2. ControlNet Preparation
- Download ControlNet from [HuggingFace](https://huggingface.co/lllyasviel/ControlNet/tree/main/models) and save the Canny edge model (`control_sd15_canny.pth`) as `ControlNet/models/control_sd15_canny.pth`.
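If you prefer a scripted download, a possible alternative uses the `huggingface_hub` library (an extra dependency, not required by this repository):

```python
# Optional scripted download via huggingface_hub (pip install huggingface_hub);
# an alternative to downloading the checkpoint manually from the browser.
from huggingface_hub import hf_hub_download

hf_hub_download(
    repo_id="lllyasviel/ControlNet",
    filename="models/control_sd15_canny.pth",
    local_dir="ControlNet",  # places the file at ControlNet/models/control_sd15_canny.pth
)
```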
### 3. Data Curation
- Choose an image classification dataset and save all images in the form of
  `<data_dir>/<dataset_name>/<train_or_test>/<class_name>/<image_filename>` (see the path example after this list).
  - `<data_dir>`: directory to store all datasets.
  - `<dataset_name>`: name of the dataset.
  - `<train_or_test>`: either "train" or "test".
  - `<class_name>`: the class tag, e.g., "cupcakes".
  - `<image_filename>`: the image filename, e.g., "0001.jpg".
- If the dataset is not implemented, add it to the scripts:
  - In `utils/data.py`, define another dataset class by specifying the dataset name and its number of categories.
  - In `utils/data_manager.py`, add a mapping from the dataset name to its class instance in `_get_idata()`.
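With hypothetical values for each placeholder, an image path under this layout is composed as follows:

```python
# Illustrative only: how an image path is composed under the required layout.
# All concrete values below are hypothetical.
import os

data_dir = "/path/to/datasets"   # <data_dir>: directory storing all datasets
dataset_name = "food101"         # <dataset_name>
split = "train"                  # <train_or_test>
class_name = "cupcakes"          # <class_name>: the class tag
image_filename = "0001.jpg"      # <image_filename>

image_path = os.path.join(data_dir, dataset_name, split, class_name, image_filename)
print(image_path)  # /path/to/datasets/food101/train/cupcakes/0001.jpg
```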
### 4. Path Configuration
- In `compress/compress.py` and `compress/compress_lowres.py`, set `CNET_DIR` to the path of the cloned ControlNet repository, e.g., `<parent_dir>/ControlNet` (see the example after this list).
- In `compress/compress.py` and `compress/compress_lowres.py`, set `DATA_DIR` to `<data_dir>`.
- In `utils/data.py`, set `self.data_dir` to `<data_dir>`.
- In `utils/toolkit.py`, set the return value of `output_folder()` to the path of the output folder.
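For instance (illustrative values only; adjust to your own paths):

```python
# Illustrative values for the constants in compress/compress.py and
# compress/compress_lowres.py; the paths below are placeholders.
CNET_DIR = "/home/user/ControlNet"   # the cloned ControlNet repository
DATA_DIR = "/path/to/datasets"       # the same <data_dir> used in step 3
```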
### 5. Prompt Extraction and Image Generation
- We compress all training images into edge maps and generate their copies with ControlNet in advance.
- In `compress/compress.py` and `compress/compress_lowres.py`, set `SEEDS` to the seeds of your choice, e.g., `range(5)`.
- If the images are relatively high-resolution, run
```
python compress/compress.py
```
- If the images are relatively low-resolution, run
```
python compress/compress_lowres.py
```

### 6. Training Configuration
- In `exps/config.json`, set the training parameters. Most configurations follow the convention of [PyCIL](https://github.com/G-U-N/PyCIL), except:
- `augmentation_prob`: the probability of replacing a real image with one of its generated copies during training.
- `augmentations_per_image`: the number of generated copies per image.
- `memory_per_class`: memory budget in units per class.
- `real_per_class`: number of real exemplars per class.
- `syn_per_class`: number of synthetic exemplars per class.
- For instance, with `memory_per_class=20`, one can set `real_per_class=18` and `syn_per_class=48`.
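A minimal, illustrative excerpt of `exps/config.json` showing only the ESCORT-specific keys listed above (the values for `augmentation_prob` and `augmentations_per_image` are assumptions; the remaining PyCIL-style keys are omitted):

```json
{
    "augmentation_prob": 0.5,
    "augmentations_per_image": 5,
    "memory_per_class": 20,
    "real_per_class": 18,
    "syn_per_class": 48
}
```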
### 7. CIL Model Training
```
python main.py
```

## Acknowledgement
The CIL framework is developed based on [PyCIL](https://github.com/G-U-N/PyCIL).