Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Framework for Analysis of Class-Incremental Learning with 12 state-of-the-art methods and 3 baselines.
- Host: GitHub
- URL: https://github.com/mmasana/FACIL
- Owner: mmasana
- License: mit
- Created: 2020-09-22T10:52:32.000Z (about 4 years ago)
- Default Branch: master
- Last Pushed: 2023-05-26T16:28:09.000Z (over 1 year ago)
- Last Synced: 2024-07-22T16:52:19.478Z (4 months ago)
- Topics: continual-learning, deep-learning, framework, incremental-learning, lifelong-learning, machine-learning, reproducible-research, survey
- Language: Python
- Homepage: https://arxiv.org/pdf/2010.15277.pdf
- Size: 7.42 MB
- Stars: 512
- Watchers: 9
- Forks: 98
- Open Issues: 11
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Framework for Analysis of Class-Incremental Learning
---
What is FACIL • Key Features • How To Use • Approaches • Datasets • Networks • License • Cite

---
## What is FACIL
FACIL started as code for the paper:
_**Class-incremental learning: survey and performance evaluation**_
*Marc Masana, Xialei Liu, Bartlomiej Twardowski, Mikel Menta, Andrew D. Bagdanov, Joost van de Weijer*
([TPAMI](https://ieeexplore.ieee.org/abstract/document/9915459)) ([arxiv](https://arxiv.org/abs/2010.15277))

It allows you to reproduce the results in the paper, and provides a (hopefully!) helpful framework to develop new
methods for incremental learning and analyse existing ones. Our idea is to expand the available approaches
and tools with the help of the community. To help FACIL grow, don't forget to star this GitHub repository and
share it with friends and coworkers!

## Key Features
We provide a framework based on class-incremental learning. However, task-incremental learning is also fully
supported. Experiments by default provide results on both task-aware and task-agnostic evaluation (a sketch of the
difference follows the table below). Furthermore, if an experiment runs with one task on one dataset, the results are
equivalent to 'common' supervised learning.

| Setting | task-ID at train time | task-ID at test time | # of tasks |
| ----- | ------------------------- | ------------------------ | ------------ |
| [class-incremental learning](https://ieeexplore.ieee.org/abstract/document/9915459) | yes | no | ≥1 |
| [task-incremental learning](https://ieeexplore.ieee.org/abstract/document/9349197) | yes | yes | ≥1 |
| non-incremental supervised learning | yes | yes | 1 |
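To make the task-aware vs. task-agnostic distinction concrete, here is a minimal, self-contained sketch. It is an illustration only, not FACIL's actual code: the `MultiHeadNet` and `predict` names are made up, and PyTorch is assumed. With one output head per task, task-aware evaluation classifies within the head selected by the task-ID, while task-agnostic evaluation concatenates all heads so every class seen so far competes.

```python
# Illustration only (not FACIL code): task-aware vs. task-agnostic evaluation
# with a multi-head network, assuming PyTorch is installed.
import torch
import torch.nn as nn


class MultiHeadNet(nn.Module):
    """A shared feature extractor with one classification head per task."""

    def __init__(self, in_features, feat_dim, classes_per_task):
        super().__init__()
        self.features = nn.Sequential(nn.Linear(in_features, feat_dim), nn.ReLU())
        self.heads = nn.ModuleList([nn.Linear(feat_dim, c) for c in classes_per_task])

    def forward(self, x):
        feats = self.features(x)
        return [head(feats) for head in self.heads]  # one logits tensor per task


@torch.no_grad()
def predict(model, x, task_id=None):
    outputs = model(x)
    if task_id is not None:
        # Task-aware (task-incremental): the task-ID selects the head, so only
        # the classes of that task compete. Returns within-task class indices.
        return outputs[task_id].argmax(dim=1)
    # Task-agnostic (class-incremental): concatenate all heads so every class
    # seen so far competes. Returns global class indices.
    return torch.cat(outputs, dim=1).argmax(dim=1)


# Two tasks with 10 classes each, e.g. a CIFAR-100-like split.
model = MultiHeadNet(in_features=32 * 32 * 3, feat_dim=64, classes_per_task=[10, 10])
x = torch.randn(4, 32 * 32 * 3)
print(predict(model, x, task_id=1))  # task-aware prediction for task 1
print(predict(model, x))             # task-agnostic prediction
```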
Currently available approaches include:

Finetuning • Freezing • Joint • LwF • iCaRL • EWC • PathInt • MAS • RWalk • EEIL • LwM • DMC • BiC • LUCIR • IL2M
## How To Use
Clone this github repository:
```
git clone https://github.com/mmasana/FACIL.git
cd FACIL
```

Optionally, create an environment to run the code.
### Using a requirements file
The library requirements of the code are detailed in [requirements.txt](requirements.txt). You can install them
using pip with:
```
python3 -m pip install -r requirements.txt
```

### Using a conda environment
The development environment is based on the Conda distribution. All dependencies are listed in the `environment.yml` file.

#### Create env
To create a new environment check out the repository and type:
```
conda env create --file environment.yml --name FACIL
```
*Notice:* set the appropriate version of your CUDA driver for `cudatoolkit` in `environment.yml`.

#### Environment activation/deactivation
```
conda activate FACIL
conda deactivate
```

To run the basic code:
```
python3 -u src/main_incremental.py
```
More options are explained in [`src`](./src), including GridSearch usage, as well as more specific options on approaches,
loggers, datasets and networks.

### Scripts
We provide scripts to reproduce the specific scenarios proposed in
_**Class-incremental learning: survey and performance evaluation**_:

* CIFAR-100 (10 tasks) with ResNet-32 without exemplars
* CIFAR-100 (10 tasks) with ResNet-32 with fixed and growing memory
* _MORE COMING SOON..._

All scripts are run 10 times to later calculate the mean and standard deviation of the results.
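As a rough sketch of that aggregation step (this is not one of the provided scripts, and the accuracy values below are placeholders, not real results):

```python
# Minimal sketch: aggregate the final accuracy of each of the 10 runs into
# "mean ± standard deviation". The numbers below are placeholders only.
import statistics

run_accuracies = [48.2, 47.5, 49.1, 48.8, 47.9, 48.4, 49.0, 47.7, 48.6, 48.3]

mean = statistics.mean(run_accuracies)
std = statistics.stdev(run_accuracies)  # sample standard deviation
print(f"{mean:.2f} ± {std:.2f}")
```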
Check out all the available scripts in the [scripts](scripts) folder.

## License
Please check the MIT license listed in this repository.

## Cite
If you want to cite the framework, feel free to use the following citation:
```bibtex
@article{masana2022class,
  title={Class-Incremental Learning: Survey and Performance Evaluation on Image Classification},
  author={Masana, Marc and Liu, Xialei and Twardowski, Bartłomiej and Menta, Mikel and Bagdanov, Andrew D. and van de Weijer, Joost},
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
  doi={10.1109/TPAMI.2022.3213473},
  year={2023},
  volume={45},
  number={5},
  pages={5513-5533}
}
```

---
The basis of FACIL is made possible thanks to [Marc Masana](https://github.com/mmasana),
[Xialei Liu](https://github.com/xialeiliu), [Bartlomiej Twardowski](https://github.com/btwardow)
and [Mikel Menta](https://github.com/mkmenta). Code structure is inspired by [HAT](https://github.com/joansj/hat). Feel free to contribute or propose new features by opening an issue!