Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/grypesc/adagauss
2024 Neurips paper on Continual Learning and Class Incremental Learning
- Host: GitHub
- URL: https://github.com/grypesc/adagauss
- Owner: grypesc
- Created: 2024-09-26T14:00:43.000Z (3 months ago)
- Default Branch: master
- Last Pushed: 2024-11-08T09:32:21.000Z (about 1 month ago)
- Last Synced: 2024-11-08T10:24:49.597Z (about 1 month ago)
- Topics: catastrophic-forgetting, class-incremental, class-incremental-learning, continual-learning, facil, lifelong-learning, lifelong-machine-learning, machine-learning, machinelearning
- Language: Python
- Homepage: https://arxiv.org/abs/2409.18265
- Size: 1.01 MB
- Stars: 4
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# AdaGauss repository
This repository contains code for the NeurIPS 2024 paper on Continual Learning: **Task-recency bias strikes back: Adapting covariances in Exemplar-Free Class Incremental Learning** (https://arxiv.org/abs/2409.18265). The repository is based on the FACIL benchmark.
We consider the exemplar-free class-incremental scenario, where we revisit the task-recency bias. Unlike previous works, which focused on the biased classification head, we look at the latent space. We show that old class representations have lower ranks than those of new classes, and that this is the core of the problem. We solve this issue with an anti-collapse loss. Additionally, we are the first to adapt the covariances of classes from old tasks to the new one.
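The rank observation above can be checked directly: stack a class's latent features into a matrix and count how many singular values are non-negligible. A minimal sketch (the tolerance and the synthetic data below are illustrative assumptions, not taken from the paper):

```python
import torch

def effective_rank(features: torch.Tensor, tol: float = 1e-3) -> int:
    """Count singular values above a tolerance relative to the largest one.

    `features` is an (N, D) matrix of latent representations for one class.
    """
    s = torch.linalg.svdvals(features - features.mean(dim=0))
    return int((s > tol * s[0]).sum())

# Illustrative only: features that vary in few directions have low rank.
low_rank = torch.randn(100, 4) @ torch.randn(4, 64)   # collapsed: rank <= 4
full_rank = torch.randn(100, 64)                      # spans the whole space
```

A collapsed class yields an effective rank far below the latent dimension, which is the symptom the anti-collapse loss is meant to prevent.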
In our method, we train the feature extractor on all tasks using cross-entropy, feature distillation through a neural projector, and an anti-collapse loss. We represent each class as a Gaussian distribution in the latent space. After each task, we transform these distributions from the old model's latent space to the new one using an auxiliary neural network (to alleviate the semantic drift problem).
![image](images/method.png?raw=true "Adagauss")
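To illustrate the Gaussian modelling and the adaptation step, here is a hedged sketch: per-class means and covariances are fitted in the latent space, and an old-task Gaussian is pushed through a stand-in adapter network by linearising it at the mean. The `adapter` module, its dimensions, and the linearisation are assumptions for illustration; the paper's auxiliary network and training procedure may differ.

```python
import torch

def fit_gaussians(feats: torch.Tensor, labels: torch.Tensor) -> dict:
    """Fit a per-class (mean, covariance) pair in the latent space."""
    stats = {}
    for c in labels.unique():
        f = feats[labels == c]
        stats[int(c)] = (f.mean(dim=0), torch.cov(f.T))
    return stats

# Hypothetical adapter standing in for the paper's auxiliary network;
# 64 is an assumed latent dimension.
adapter = torch.nn.Linear(64, 64)

def adapt_gaussian(mean: torch.Tensor, cov: torch.Tensor, net) -> tuple:
    """Push a Gaussian through `net`, linearised at the mean:
    mean -> net(mean), cov -> J cov J^T with J the Jacobian at the mean."""
    jac = torch.autograd.functional.jacobian(net, mean)
    return net(mean).detach(), jac @ cov @ jac.T
```

For a linear adapter the transformation is exact; for a nonlinear network it is only a first-order approximation around the class mean.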
### Setup
Create virtual environment and install dependencies:
```bash
python3 -m venv venv && source venv/bin/activate
pip install torch==2.2.0 torchvision==0.17.0 torchaudio==2.2.0 --index-url https://download.pytorch.org/whl/cu118
pip install -r requirements.txt
```
Reproduce experiments using the scripts in the `scripts` directory:
```bash
bash scripts/cifar-10x10.sh
```
Feel free to contact me on LinkedIn if you have any questions.