Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/grypesc/seed
ICLR2024 paper on Continual Learning
- Host: GitHub
- URL: https://github.com/grypesc/seed
- Owner: grypesc
- License: mit
- Created: 2023-12-18T11:37:48.000Z (11 months ago)
- Default Branch: main
- Last Pushed: 2024-04-21T19:44:38.000Z (7 months ago)
- Last Synced: 2024-04-22T00:56:40.452Z (7 months ago)
- Topics: class-incremental, class-incremental-learning, computer-vision, continual-learning, facil, machine-learning
- Language: Python
- Homepage: https://arxiv.org/abs/2401.10191
- Size: 427 KB
- Stars: 19
- Watchers: 2
- Forks: 3
- Open Issues: 1
- Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Divide and not forget: Ensemble of selectively trained experts in Continual Learning (ICLR2024, main track)
https://arxiv.org/abs/2401.10191
https://openreview.net/forum?id=sSyytcewxe

![image](inference.jpg?raw=true "inference")
This repository contains the code for the SEED paper, published in the main track of ICLR2024. It is built on the FACIL benchmark (https://github.com/mmasana/FACIL).
Set up the environment according to the FACIL README, then reproduce the results by running one of the provided scripts.
Run SEED on CIFAR100, 10 tasks of 10 classes each:
```bash
bash cifar10x10.sh
```

Run SEED on CIFAR100, 20 tasks of 5 classes each:
```bash
bash cifar20x5.sh
```

Run SEED on CIFAR100, 50 tasks of 2 classes each:
```bash
bash cifar50x2.sh
```
To lower the number of parameters as in Tab. 5, use `--network resnet20 --shared 2`. You can also add parameter pruning as in DER.

To reproduce the results for the ImageNet subset, download it from https://www.kaggle.com/datasets/arjunashok33/imagenet-subset-for-inc-learn and unzip it in the `../data` directory, then run:
```bash
bash imagenet10x10.sh
```

To reproduce the results for DomainNet, download it from http://ai.bu.edu/M3SDA/ and unzip it in the `../data/domainnet` directory.
Run SEED on DomainNet, 36 tasks from different domains with 5 classes each:
```bash
bash domainnet36x5.sh
```
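Before launching the ImageNet or DomainNet runs, it can help to sanity-check that the expected data layout is in place. A minimal sketch (the `check_dirs` helper is illustrative, not part of the repository; the paths are the ones assumed by the instructions above, relative to the repository root):

```shell
# Print whether each assumed dataset directory exists.
check_dirs() {
  for d in "$@"; do
    if [ -d "$d" ]; then
      echo "found: $d"
    else
      echo "missing: $d"
    fi
  done
}

check_dirs ../data ../data/domainnet
```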
You can add the `--extra-aug fetril` flag to enable stronger augmentations.

If you would like to cooperate on improving the method, please contact me via LinkedIn or Facebook; I have several ideas.
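Taken together, a hypothetical combined invocation might look like the following. This is only a sketch: it assumes the provided scripts forward extra command-line flags to the underlying training command, which you should verify in the scripts themselves (here `echo` just prints the commands instead of running them):

```shell
# Print a reduced-parameter SEED command with stronger augmentations for each
# CIFAR100 split. Replace `echo` with direct execution once you have confirmed
# the scripts accept extra flags.
for split in cifar10x10 cifar20x5 cifar50x2; do
  echo bash "${split}.sh" --network resnet20 --shared 2 --extra-aug fetril
done
```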
If you find this work useful, please consider citing it:
```
@inproceedings{rypesc2023divide,
  title={Divide and not forget: Ensemble of selectively trained experts in Continual Learning},
  author={Rype{\'s}{\'c}, Grzegorz and Cygert, Sebastian and Khan, Valeriya and Trzcinski, Tomasz and Zieli{\'n}ski, Bartosz Micha{\l} and Twardowski, Bart{\l}omiej},
  booktitle={The Twelfth International Conference on Learning Representations},
  year={2023}
}
```