{"id":17313002,"url":"https://github.com/mattdl/continualprototypeevolution","last_synced_at":"2025-07-15T12:43:16.091Z","repository":{"id":56728359,"uuid":"339505648","full_name":"Mattdl/ContinualPrototypeEvolution","owner":"Mattdl","description":"Codebase for Continual Prototype Evolution (CoPE) to attain perpetually representative prototypes for online and non-stationary datastreams. Includes implementation of the Pseudo-Prototypical Proxy (PPP) loss.","archived":false,"fork":false,"pushed_at":"2022-03-21T08:39:27.000Z","size":165,"stargazers_count":44,"open_issues_count":1,"forks_count":8,"subscribers_count":1,"default_branch":"main","last_synced_at":"2025-06-16T07:50:28.610Z","etag":null,"topics":["computer-vision","continual-learning","datastreaming","deep-learning","online-learning","prototypes","representation-learning"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"other","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/Mattdl.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2021-02-16T19:21:13.000Z","updated_at":"2025-01-16T07:15:19.000Z","dependencies_parsed_at":"2022-08-16T00:40:10.619Z","dependency_job_id":null,"html_url":"https://github.com/Mattdl/ContinualPrototypeEvolution","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/Mattdl/ContinualPrototypeEvolution","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Mattdl%2FContinualPrototypeEvolution","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Mattdl%2FContinualPrototypeEvolution/tags","releases_url":"https://repos.ecosyste.ms/api/v1/host
s/GitHub/repositories/Mattdl%2FContinualPrototypeEvolution/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Mattdl%2FContinualPrototypeEvolution/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/Mattdl","download_url":"https://codeload.github.com/Mattdl/ContinualPrototypeEvolution/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Mattdl%2FContinualPrototypeEvolution/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":265437196,"owners_count":23765118,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["computer-vision","continual-learning","datastreaming","deep-learning","online-learning","prototypes","representation-learning"],"created_at":"2024-10-15T12:45:21.710Z","updated_at":"2025-07-15T12:43:16.073Z","avatar_url":"https://github.com/Mattdl.png","language":"Python","readme":"# Continual Prototype Evolution (CoPE)\n\nContinual Prototype Evolution (CoPE) establishes online adaptation of class-representative prototypes in non-stationary data streams, exploiting latent-space representations in the novel PPP-loss to advance the state of the art in continual learning.\n\nThis codebase contains the original PyTorch implementation of CoPE, along with the Split-MNIST, Split-CIFAR10, and Split-CIFAR100 benchmarks.\nEach benchmark has both a balanced and a highly imbalanced variant, the latter more closely resembling real-life settings.\nBaselines outperformed in these settings include: CoPE-CrossEntropy, GEM, iCaRL, GSS, reservoir sampling, finetuning, 
online iid, and offline iid.\n\n- This work was accepted at the International Conference on Computer Vision (ICCV) 2021.\n- Resources: **[Open-Access paper @ICCV 2021](https://openaccess.thecvf.com/content/ICCV2021/html/De_Lange_Continual_Prototype_Evolution_Learning_Online_From_Non-Stationary_Data_Streams_ICCV_2021_paper.html)** | **[Supplemental Materials](https://openaccess.thecvf.com/content/ICCV2021/supplemental/De_Lange_Continual_Prototype_Evolution_ICCV_2021_supplemental.pdf)**\n\n\n\u003cimg src=\"CoPE_fig.png\" width=\"800\"\u003e\n\n**Keywords**: continual learning, prototypical learning, online learning, incremental learning, deep learning, representation learning, catastrophic forgetting, concept drift\n\n\n## Results\nThe main scripts [main_MNIST.sh](main_MNIST.sh), [main_CIFAR10.sh](main_CIFAR10.sh), and [main_CIFAR100.sh](main_CIFAR100.sh)\ncontain the fully automatic pipeline (including automatic data preparation), with hyperparameter configs for all experiments in the main paper.\n\nThe **balanced** setups contain:\n- Split-MNIST, Split-CIFAR10, Split-CIFAR100, and\n- the lower-capacity benchmarks Split-MNIST-mini and Split-CIFAR10-mini.\n\nThe **imbalanced** setups contain (averaged over 5 different choices of dominant task):\n- Imbalanced Split-MNIST: 1 task with 2k samples, others 0.2k (5 tasks)\n- Imbalanced Split-CIFAR10: 1 task with 4k samples, others 0.4k (5 tasks)\n- Imbalanced Split-CIFAR100: 1 task with 2k samples, others 1k (20 tasks)\n\n## Requirements\n- Python 3.7\n- PyTorch 1.5 ([instructions](https://pytorch.org/get-started/previous-versions/#v150))\n- To install dependencies:\n    - Use [environment.yml](environment.yml) to create the Anaconda environment:\n        \n            conda env create -f environment.yml         # Env named 'cope'\n            conda activate cope\n    - Or manually, as in:\n    \n            # Create and activate environment\n            conda create -n \u003cname\u003e python=3.7\n            conda activate \u003cname\u003e\n\n            # PyTorch 
(e.g. for CUDA 10.2)\n            conda install pytorch==1.5.0 torchvision==0.6.0 cudatoolkit=10.2 -c pytorch\n\n            # Optional\n            conda install -c conda-forge matplotlib=3.1.3       # T-SNE plots\n            conda install -c conda-forge scikit-learn=0.22.1\n            conda install -c omnia quadprog                     # GEM baseline\n\n## Reproducing paper results\nThis final codebase has been validated to produce results similar to those reported in the paper.\n- To avoid issues, use the *exact* dependency requirements defined above.\n- The original implementation doesn't re-normalize prototypes after the momentum update; doing so slightly decreases average accuracy.\n- Results of the final codebase after cleanup have been checked. Average accuracy on the balanced benchmarks: 94.11±0.76 (MNIST, 5 seeds), 49.61±3.44 (CIFAR10, 5 seeds), 20.51 (CIFAR100, 1 seed).\nReport an issue or contact me if you have trouble reproducing these.\n\n## Online data-incremental learning\nAlthough the data streams are divided into tasks for comparison with task- and class-incremental learning algorithms (iCaRL, GEM),\nthe continual learner in CoPE is unaware of tasks or task transitions.\nThis means CoPE can learn from any labeled data stream, without the bias of hand-designed task boundaries within the stream.\n\n\n## Learner-evaluator framework\nThe learner-evaluator framework defined in the [paper](https://arxiv.org/pdf/2009.00919.pdf) explicitly models all the\nrequirements of the continual learning system. 
\n\nWe define the **learner** here for CoPE:\n- The horizon = the currently observed batch (online processing)\n- The operational memory = replay memory + prototypical memory\n\nAnd for the **evaluator**:\n- Periodicity (rho) = evaluation on task transitions\n- Eval distribution = static class distributions, evaluated on the classes observed by the learner\n\n\n## Credits\n- Please consider citing our work if you use this repo.\n\n        @InProceedings{De_Lange_2021_ICCV,\n            author    = {De Lange, Matthias and Tuytelaars, Tinne},\n            title     = {Continual Prototype Evolution: Learning Online From Non-Stationary Data Streams},\n            booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},\n            month     = {October},\n            year      = {2021},\n            pages     = {8250-8259}\n        }\n- CoPE is also available in the [Avalanche framework](https://github.com/ContinualAI/avalanche/) (free to use under the MIT license)!\n- Thanks to the following repositories:\n    - https://github.com/facebookresearch/GradientEpisodicMemory\n    - https://github.com/rahafaljundi/Gradient-based-Sample-Selection\n    - https://github.com/Mattdl/CLsurvey\n\nThis source code is released under an Attribution-NonCommercial 4.0 International license; find out more in the [LICENSE file](LICENSE).\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fmattdl%2Fcontinualprototypeevolution","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fmattdl%2Fcontinualprototypeevolution","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fmattdl%2Fcontinualprototypeevolution/lists"}