Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/universome/class-norm
Class Normalization for Continual Zero-Shot Learning
- Host: GitHub
- URL: https://github.com/universome/class-norm
- Owner: universome
- Created: 2019-10-02T10:02:43.000Z (over 5 years ago)
- Default Branch: master
- Last Pushed: 2021-02-05T11:37:36.000Z (about 4 years ago)
- Last Synced: 2024-08-13T20:17:17.212Z (6 months ago)
- Topics: continual-learning, iclr2021, initialization, lifelong-learning, normalization, zero-shot-learning
- Language: Python
- Homepage:
- Size: 70.9 MB
- Stars: 34
- Watchers: 5
- Forks: 3
- Open Issues: 2
Metadata Files:
- Readme: README.md
# About
This repo contains the code for the [Class Normalization for Continual Zero-Shot Learning paper](https://arxiv.org/abs/2006.11328) from ICLR 2021:
- the code to reproduce ZSL and CZSL results
- the proposed CZSL metrics (located in `src/utils/metrics.py`)
- a fast Python implementation of the AUSUC metric
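As background, AUSUC (Area Under the Seen-Unseen accuracy Curve) sweeps a calibration factor that penalizes seen-class scores and measures the area under the resulting seen/unseen accuracy curve. Below is a minimal, hedged sketch of that idea (per-sample accuracies for brevity, whereas the standard metric averages accuracy per class; the implementation in this repo may differ):

```python
import numpy as np

def ausuc(scores, labels, seen_mask, n_gammas=201):
    """Simplified AUSUC sketch via calibrated stacking.

    scores:    (n_samples, n_classes) raw classifier scores
    labels:    (n_samples,) ground-truth class ids
    seen_mask: (n_classes,) bool array, True for seen classes
    """
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels)
    seen_mask = np.asarray(seen_mask, dtype=bool)

    # Subtracting gamma from seen-class scores shifts predictions toward
    # unseen classes, so unseen accuracy is non-decreasing in gamma and
    # seen accuracy non-increasing.
    span = scores.max() - scores.min()
    gammas = np.linspace(-span - 1.0, span + 1.0, n_gammas)

    from_seen = seen_mask[labels]
    accs_s, accs_u = [], []
    for g in gammas:
        preds = (scores - g * seen_mask).argmax(axis=1)
        correct = preds == labels
        accs_s.append(correct[from_seen].mean())
        accs_u.append(correct[~from_seen].mean())

    # Area under the seen-vs-unseen accuracy curve (trapezoid rule);
    # accs_u is already monotone because gammas are swept in order.
    u, s = np.array(accs_u), np.array(accs_s)
    return float(np.sum((u[1:] - u[:-1]) * (s[1:] + s[:-1]) / 2.0))
```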
In this project, we explored different normalization strategies used in ZSL and proposed a new one, class normalization, which is suited for deep attribute embedders.
This allowed us to outperform existing ZSL models with a simple 3-layer MLP trained in just 30 seconds.
We also extended ZSL ideas to a more general setting, Continual Zero-Shot Learning, proposed a set of metrics for it, and tested several baselines.
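Our reading of the class-normalization idea can be sketched as a LayerNorm-style standardization of each class prototype, without learned affine parameters (a hedged illustration only; see the paper and the code in `src/` for the exact formulation used in the experiments):

```python
import numpy as np

def class_norm(class_embs, eps=1e-8):
    """Standardize each class embedding across its feature dimension.

    Each class prototype is shifted to zero mean and scaled to unit
    variance, which keeps the scale of image-feature/prototype logits
    comparable across classes (our reading of the motivation).
    """
    c = np.asarray(class_embs, dtype=float)   # (n_classes, d)
    mu = c.mean(axis=1, keepdims=True)
    sd = c.std(axis=1, keepdims=True)
    return (c - mu) / (sd + eps)
```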
# Installation & training
## Data preparation
### For ZSL
For ZSL, we tested our method on the standard GBU datasets which you can download from [the original website](https://www.mpi-inf.mpg.de/departments/computer-vision-and-machine-learning/research/zero-shot-learning/zero-shot-learning-the-good-the-bad-and-the-ugly).
The easiest way to reproduce the results is to follow our [Google Colab notebook](class-norm-for-czsl.ipynb).

### For CZSL
For CZSL, we tested our method on SUN and CUB datasets.
In contrast to ZSL, in CZSL we used raw images as inputs instead of an ImageNet-pretrained model's features.
For CUB, please follow the instructions in the [A-GEM repo](https://github.com/facebookresearch/agem). Note that the CUB images must now be downloaded manually from [here](http://www.vision.caltech.edu/visipedia/CUB-200-2011.html); we used the same splits as A-GEM.
Put the A-GEM splits into the CUB data folder.

For SUN, download the data from the [official website](https://cs.brown.edu/~gmpatter/sunattributes.html), put it under `data/SUN`, and then follow the instructions in [scripts/sun_data_preprocessing.py](scripts/sun_data_preprocessing.py).
## Installing the `firelab` dependency
You will need to install the [firelab](https://github.com/universome/firelab) library to run the training:
```
pip install firelab
```## Running ZSL training
Please refer to the [Google Colab notebook](class-norm-for-czsl.ipynb): it contains the code to reproduce our results.

## Running CZSL training
To run CZSL training, use the following command:
```
python src/run.py -c basic|agem|mas|joint -d cub|sun
```
Please note that by default we load all the data into memory (to speed things up).
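The speed/memory trade-off behind this preloading can be sketched generically (illustrative only; the class and field names below are ours, not the repo's actual data loader):

```python
class CachedDataset:
    """Toy dataset that either preloads all items or loads lazily.

    Preloading (in_memory=True) pays RAM up front but avoids per-item
    IO/decoding work on every training step; lazy loading does the
    opposite.
    """

    def __init__(self, loaders, in_memory=True):
        self._loaders = loaders  # one zero-argument callable per item
        # Eagerly materialize every item when preloading is requested.
        self._cache = [load() for load in loaders] if in_memory else None

    def __len__(self):
        return len(self._loaders)

    def __getitem__(self, i):
        if self._cache is not None:
            return self._cache[i]      # fast path: already in RAM
        return self._loaders[i]()      # slow path: load on demand
```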
This behaviour is controlled by the `in_memory` flag in the config.

# Results
## Zero-shot learning results
## Continual Zero-Shot Learning results
## Training speed results for ZSL