Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/floodsung/LearningToCompare_FSL
PyTorch code for CVPR 2018 paper: Learning to Compare: Relation Network for Few-Shot Learning (Few-Shot Learning part)
few-shot-learning meta-learning
Last synced: about 2 months ago
- Host: GitHub
- URL: https://github.com/floodsung/LearningToCompare_FSL
- Owner: floodsung
- License: MIT
- Created: 2018-03-28T12:14:14.000Z (over 6 years ago)
- Default Branch: master
- Last Pushed: 2019-10-22T03:19:44.000Z (about 5 years ago)
- Last Synced: 2024-10-15T11:03:04.640Z (about 2 months ago)
- Topics: few-shot-learning, meta-learning
- Language: Python
- Size: 8.19 MB
- Stars: 1,045
- Watchers: 32
- Forks: 268
- Open Issues: 33
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- awesome-few-shot-meta-learning - code - official (PyTorch fewshot)
README
# LearningToCompare_FSL
PyTorch code for the CVPR 2018 paper [Learning to Compare: Relation Network for Few-Shot Learning](https://arxiv.org/abs/1711.06025) (few-shot learning part). For the zero-shot learning part, please visit [LearningToCompare_ZSL](https://github.com/lzrobots/LearningToCompare_ZSL).
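As orientation only (not the repository's actual code, which uses conv-block embeddings under PyTorch 0.3), here is a minimal sketch of the paper's relation idea in modern PyTorch: each query embedding is concatenated with a support-class embedding, and a small learned module regresses a relation score in [0, 1]; the paper trains these scores with MSE against one-hot labels.
```
import torch
import torch.nn as nn
import torch.nn.functional as F

class RelationModule(nn.Module):
    """Toy relation module: scores a concatenated (support, query)
    embedding pair with a value in [0, 1]."""
    def __init__(self, feature_dim, hidden_dim=8):
        super(RelationModule, self).__init__()
        self.fc1 = nn.Linear(feature_dim * 2, hidden_dim)
        self.fc2 = nn.Linear(hidden_dim, 1)

    def forward(self, support_feat, query_feat):
        # Concatenate along the feature axis, then regress a score.
        pair = torch.cat([support_feat, query_feat], dim=1)
        return torch.sigmoid(self.fc2(F.relu(self.fc1(pair))))

# Hypothetical 5-way episode with 64-d embeddings:
support = torch.randn(5, 64)                 # one embedding per class
query = torch.randn(1, 64).expand(5, 64)     # one query paired with each class
scores = RelationModule(64)(support, query)  # shape (5, 1); argmax = predicted class
```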
# Requirements
- Python 2.7
- PyTorch 0.3
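These version pins are old. A possible environment setup, assuming conda (PyTorch 0.3.1 was published on the pytorch conda channel; exact commands may vary by platform):
```
conda create -n relationnet python=2.7
source activate relationnet
conda install pytorch=0.3.1 torchvision -c pytorch
```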
# Data
For the Omniglot experiments, the 28x28 resized Omniglot images are attached directly in the repo; they were created based on [omniglot](https://github.com/brendenlake/omniglot) and [maml](https://github.com/cbfinn/maml).
For the mini-Imagenet experiments, please download [mini-Imagenet](https://drive.google.com/open?id=0B3Irx3uQNoBMQ1FlNXJsZUdYWEE), put it in ./datas/mini-Imagenet, and run proc_image.py to preprocess the images and generate the train/val/test splits. (This preprocessing follows [maml](https://github.com/cbfinn/maml).)
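A possible sequence, assuming the paths from the instructions above (the exact location of proc_image.py may differ in your checkout):
```
mkdir -p ./datas/mini-Imagenet
# download mini-Imagenet from the Google Drive link and unpack it here, then:
python proc_image.py
```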
# Train
Omniglot 5-way 1-shot:
```
python omniglot_train_one_shot.py -w 5 -s 1 -b 19
```
Omniglot 5-way 5-shot:
```
python omniglot_train_few_shot.py -w 5 -s 5 -b 15
```
Omniglot 20-way 1-shot:
```
python omniglot_train_one_shot.py -w 20 -s 1 -b 10
```
Omniglot 20-way 5-shot:
```
python omniglot_train_few_shot.py -w 20 -s 5 -b 5
```
mini-Imagenet 5-way 1-shot:
```
python miniimagenet_train_one_shot.py -w 5 -s 1 -b 15
```
mini-Imagenet 5-way 5-shot:
```
python miniimagenet_train_few_shot.py -w 5 -s 5 -b 10
```
You can change the -b parameter to fit your GPU memory. By default, the scripts load the trained models shipped with the repo; if you want to train from scratch, delete those model files first.
# Test
Omniglot 5-way 1-shot:
```
python omniglot_test_one_shot.py -w 5 -s 1
```
Testing for the other experiments is similar.
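For example (script names assumed to mirror the training scripts; verify against the repository):
```
python omniglot_test_few_shot.py -w 5 -s 5
python miniimagenet_test_one_shot.py -w 5 -s 1
python miniimagenet_test_few_shot.py -w 5 -s 5
```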
# Citing
If you use this code in your research, please use the following BibTeX entry.
```
@inproceedings{sung2018learning,
title={Learning to Compare: Relation Network for Few-Shot Learning},
author={Sung, Flood and Yang, Yongxin and Zhang, Li and Xiang, Tao and Torr, Philip HS and Hospedales, Timothy M},
booktitle={Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition},
year={2018}
}
```
# Reference
- [MAML](https://github.com/cbfinn/maml)
- [MAML-pytorch](https://github.com/katerakelly/pytorch-maml)