Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/arthurdouillard/incremental_learning.pytorch
A collection of incremental learning paper implementations including PODNet (ECCV20) and Ghost (CVPR-W21).
- Host: GitHub
- URL: https://github.com/arthurdouillard/incremental_learning.pytorch
- Owner: arthurdouillard
- License: mit
- Created: 2019-04-02T19:57:06.000Z (over 5 years ago)
- Default Branch: master
- Last Pushed: 2023-10-03T21:36:06.000Z (about 1 year ago)
- Last Synced: 2024-10-12T04:28:59.654Z (about 1 month ago)
- Topics: continual-learning, deep-learning, incremental-learning, lifelong-learning, pytorch, research
- Language: Python
- Homepage:
- Size: 6.03 MB
- Stars: 384
- Watchers: 15
- Forks: 60
- Open Issues: 8
Metadata Files:
- Readme: README.md
- License: LICENSE.txt
- Authors: AUTHORS
Awesome Lists containing this project
- awesome-machine-learning-resources
README
# Incremental Learners for Continual Learning
Repository storing some of my public work done during my PhD thesis (2019-).
You will find here both known implementations (iCaRL, etc.) and the implementations of all my papers.
You can find the list of the latter on my [Google Scholar](https://scholar.google.com/citations?user=snwgZBIAAAAJ&hl=en).

My work on continual segmentation can be found [here](https://github.com/arthurdouillard/CVPR2021_PLOP) and on continual data loaders [here](https://github.com/Continvvm/continuum).
## Structures
Every model must inherit from `inclearn.models.base.IncrementalLearner`.
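The contract can be sketched as follows. This is a toy stand-in, not the actual base class: the real hook names and signatures in `inclearn.models.base` may differ, and real models train networks rather than record labels.

```python
# Toy stand-in for the inheritance pattern; the actual hooks in
# inclearn.models.base.IncrementalLearner may differ (assumption).

class IncrementalLearner:
    """Simplified sketch of the base class every model inherits from."""

    def __init__(self):
        self._task = 0

    def before_task(self, train_loader):
        """Hook called before training on a new task (e.g. grow the classifier)."""

    def train_task(self, train_loader):
        """Hook running the actual training loop for the current task."""
        raise NotImplementedError

    def after_task(self, train_loader):
        """Hook called after training (e.g. update exemplar memory)."""
        self._task += 1


class MyLearner(IncrementalLearner):
    def __init__(self):
        super().__init__()
        self.seen_classes = []

    def train_task(self, train_loader):
        # Record the classes seen in this task instead of real training.
        self.seen_classes.extend(sorted({y for _, y in train_loader}))


# Two incremental "tasks", each a list of (sample, label) pairs.
learner = MyLearner()
for task in ([("img0", 0), ("img1", 1)], [("img2", 2)]):
    learner.before_task(task)
    learner.train_task(task)
    learner.after_task(task)

print(learner.seen_classes)  # [0, 1, 2]
```

The base class drives the task loop; subclasses only fill in the hooks, which is what lets every method in the repository share the same evaluation harness.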
# PODNet: Pooled Outputs Distillation for Small-Tasks Incremental Learning
[![Paper](https://img.shields.io/badge/arXiv-2004.13513-brightgreen)](https://arxiv.org/abs/2004.13513)
![ECCV](https://img.shields.io/badge/ECCV-2020-blue)
[![Youtube](https://img.shields.io/badge/Youtube-link-red)](https://www.youtube.com/watch?v=SWFO1_lTcR8)

![podnet](images/podnet.png)
![podnet plot](images/podnet_plot.png)
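The distillation term the title refers to can be illustrated with a minimal NumPy sketch: pool each intermediate feature map along height and along width, L2-normalize the concatenated profiles, and penalize their distance between the old and new models. The official implementation differs in details (it squares activations, weights layers, and adds a POD-flat term on the final embedding), so treat this as an approximation of the idea only.

```python
import numpy as np

def pod_spatial_loss(feats_old, feats_new, eps=1e-8):
    """Sketch of POD-spatial distillation between feature maps (B, C, H, W):
    pool along height and along width, concatenate the flattened,
    L2-normalized profiles, and penalize their L2 distance."""
    def profile(f):
        b = f.shape[0]
        width_profile = f.sum(axis=2)   # (B, C, W): collapse height
        height_profile = f.sum(axis=3)  # (B, C, H): collapse width
        p = np.concatenate([width_profile.reshape(b, -1),
                            height_profile.reshape(b, -1)], axis=1)
        return p / (np.linalg.norm(p, axis=1, keepdims=True) + eps)

    diff = profile(feats_old) - profile(feats_new)
    return float(np.mean(np.linalg.norm(diff, axis=1)))

rng = np.random.default_rng(0)
old = rng.standard_normal((2, 4, 8, 8))
assert pod_spatial_loss(old, old) == 0.0       # identical maps: zero loss
assert pod_spatial_loss(old, old + 1.0) > 0.0  # perturbed maps: positive loss
```

Pooling before distilling is the point: it constrains where activation mass sits along each spatial axis without pinning every individual pixel, leaving the new model slack to learn new classes.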
If you use this paper/code in your research, please consider citing us:
```
@inproceedings{douillard2020podnet,
title={PODNet: Pooled Outputs Distillation for Small-Tasks Incremental Learning},
author={Douillard, Arthur and Cord, Matthieu and Ollion, Charles and Robert, Thomas and Valle, Eduardo},
booktitle={Proceedings of the IEEE European Conference on Computer Vision (ECCV)},
year={2020}
}
```

To run experiments on CIFAR100 with three different class orders, with the challenging
setting of 50 steps:

```bash
python3 -minclearn --options options/podnet/podnet_cnn_cifar100.yaml options/data/cifar100_3orders.yaml \
--initial-increment 50 --increment 1 --fixed-memory \
--device <GPU_ID> --label podnet_cnn_cifar100_50steps \
--data-path <PATH/TO/DATA>
```

Likewise, for ImageNet100:
```bash
python3 -minclearn --options options/podnet/podnet_cnn_imagenet100.yaml options/data/imagenet100_1order.yaml \
--initial-increment 50 --increment 1 --fixed-memory \
--device <GPU_ID> --label podnet_cnn_imagenet100_50steps \
--data-path <PATH/TO/DATA>
```

And ImageNet1000:
```bash
python3 -minclearn --options options/podnet/podnet_cnn_imagenet100.yaml options/data/imagenet1000_1order.yaml \
--initial-increment 500 --increment 50 --fixed-memory --memory-size 20000 \
--device <GPU_ID> --label podnet_cnn_imagenet1000_10steps \
--data-path <PATH/TO/DATA>
```

Furthermore, several options files are available to reproduce the ablations showcased
in the paper. Please see the directory `./options/podnet/ablations/`.

# Insight From the Future for Continual Learning
[![Paper](https://img.shields.io/badge/arXiv-2006.13748-brightgreen)](https://arxiv.org/abs/2006.13748)
![CVPR Workshop](https://img.shields.io/badge/CVPRW-2021-blue)

![ghost](images/ghost.png)
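The core idea, regularizing the current model with generated "ghost" representations of classes not yet seen, can be caricatured in a few lines. This is a toy sketch only: the paper uses a learned zero-shot generator conditioned on class attributes, not the hypothetical Gaussian-around-random-means placeholder used here.

```python
import numpy as np

rng = np.random.default_rng(0)

def ghost_features(future_class_means, n_per_class, scale=0.1):
    """Sample placeholder ('ghost') features for classes not yet seen.
    In the paper these come from a learned generator; here they are
    Gaussian samples around hypothetical class means (assumption)."""
    feats, labels = [], []
    for label, mean in future_class_means.items():
        feats.append(mean + scale * rng.standard_normal((n_per_class, mean.size)))
        labels += [label] * n_per_class
    return np.vstack(feats), np.array(labels)

# Two hypothetical future classes in a 16-d feature space.
means = {3: rng.standard_normal(16), 4: rng.standard_normal(16)}
X_ghost, y_ghost = ghost_features(means, n_per_class=5)

# The ghosts can then be mixed with real features during training so the
# classifier already reserves regions of feature space for classes 3 and 4.
print(X_ghost.shape, sorted(set(y_ghost.tolist())))  # (10, 16) [3, 4]
```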
If you use this paper/code in your research, please consider citing us:
```
@inproceedings{douillard2020ghost,
title={Insight From the Future for Continual Learning},
author={Arthur Douillard and Eduardo Valle and Charles Ollion and Thomas Robert and Matthieu Cord},
booktitle={arXiv preprint library},
year={2020}
}
```

The code is still very dirty; I'll clean it later. Forgive me.