Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Implementation of "A Simple Neural Attentive Meta-Learner" (SNAIL, https://arxiv.org/pdf/1707.03141.pdf) in PyTorch
- Host: GitHub
- URL: https://github.com/eambutu/snail-pytorch
- Owner: eambutu
- Created: 2018-05-20T16:19:23.000Z (over 6 years ago)
- Default Branch: master
- Last Pushed: 2019-07-06T00:29:24.000Z (over 5 years ago)
- Last Synced: 2024-07-31T23:44:52.586Z (4 months ago)
- Topics: cnn, python, pytorch, snail
- Language: Python
- Homepage:
- Size: 16.6 KB
- Stars: 145
- Watchers: 5
- Forks: 28
- Open Issues: 3
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
- awesome-few-shot-meta-learning - code (PyTorch)
README
# A Simple Neural Attentive Meta-Learner (SNAIL) in PyTorch
An implementation of "A Simple Neural Attentive Meta-Learner" (SNAIL) ([paper](https://arxiv.org/pdf/1707.03141.pdf)) in PyTorch. Much of the boilerplate code for setting up the datasets came from a PyTorch implementation of [Prototypical Networks](https://github.com/orobix/Prototypical-Networks-for-Few-shot-Learning-PyTorch/blob/master/README.md).
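For orientation: SNAIL interleaves dilated causal (temporal) convolutions with causal soft attention over the sequence of past (input, label) pairs. Below is a minimal, hypothetical PyTorch sketch of the two building blocks; the class names and hyperparameters are illustrative, not the ones used in this repo.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalConv1d(nn.Module):
    """Dilated causal 1-D convolution: pads on the left only, so
    position t never sees inputs later than t."""
    def __init__(self, in_ch, out_ch, dilation):
        super().__init__()
        self.pad = dilation  # kernel_size=2 -> left padding = dilation
        self.conv = nn.Conv1d(in_ch, out_ch, kernel_size=2, dilation=dilation)

    def forward(self, x):  # x: (batch, channels, time)
        return self.conv(F.pad(x, (self.pad, 0)))

class DenseBlock(nn.Module):
    """Gated causal convolution whose output is concatenated to its
    input along the channel dimension (as in the SNAIL paper's TC blocks)."""
    def __init__(self, in_ch, filters, dilation):
        super().__init__()
        self.f = CausalConv1d(in_ch, filters, dilation)
        self.g = CausalConv1d(in_ch, filters, dilation)

    def forward(self, x):
        out = torch.tanh(self.f(x)) * torch.sigmoid(self.g(x))
        return torch.cat([x, out], dim=1)

class AttentionBlock(nn.Module):
    """Single-head causal soft attention; the attended values are
    concatenated to the input channels."""
    def __init__(self, in_ch, key_dim, value_dim):
        super().__init__()
        self.q = nn.Linear(in_ch, key_dim)
        self.k = nn.Linear(in_ch, key_dim)
        self.v = nn.Linear(in_ch, value_dim)
        self.scale = math.sqrt(key_dim)

    def forward(self, x):            # x: (batch, channels, time)
        h = x.transpose(1, 2)        # (batch, time, channels)
        logits = self.q(h) @ self.k(h).transpose(1, 2) / self.scale
        t = h.size(1)
        # Mask out attention to future timesteps.
        mask = torch.triu(torch.ones(t, t, dtype=torch.bool), diagonal=1)
        logits = logits.masked_fill(mask, float("-inf"))
        out = torch.softmax(logits, dim=-1) @ self.v(h)  # (batch, time, value_dim)
        return torch.cat([x, out.transpose(1, 2)], dim=1)
```

A full SNAIL model stacks several such blocks, so the channel count grows with depth because each block concatenates its output to its input.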
## Mini-Imagenet Dataset
Follow the instructions at https://github.com/renmengye/few-shot-ssl-public to download the mini-Imagenet dataset.

## Performance
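All numbers below are N-way, K-shot classification accuracies: each episode pairs a small labeled support set with query points to classify. A minimal sketch of how such an episode could be sampled (a hypothetical helper, not this repo's actual data loader):

```python
import random

def sample_episode(labels_to_indices, n_way=5, k_shot=1, q_queries=1, seed=None):
    """Sample one N-way, K-shot episode.

    labels_to_indices: dict mapping each class label to a list of example indices.
    Returns (support, query), each a list of (index, label) pairs, with the
    support and query examples of a class drawn without overlap.
    """
    rng = random.Random(seed)
    classes = rng.sample(sorted(labels_to_indices), n_way)
    support, query = [], []
    for cls in classes:
        picked = rng.sample(labels_to_indices[cls], k_shot + q_queries)
        support += [(i, cls) for i in picked[:k_shot]]
        query += [(i, cls) for i in picked[k_shot:]]
    return support, query
```

For SNAIL specifically, the support pairs are then fed as a sequence (each input concatenated with its label) followed by the unlabeled query.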
Below are attempts to reproduce the results in the reference paper:

### Omniglot:
| Model | 1-shot (5-way Acc.) | 5-shot (5-way Acc.) | 1-shot (20-way Acc.) | 5-shot (20-way Acc.) |
| --- | --- | --- | --- | --- |
| Reference Paper | 99.07% | 99.78% | 97.64% | 99.36%|
| This repo | 98.31%\* | 99.26%\*\* | 93.75%° | 97.88%°° |

\* achieved running `python train.py --exp omniglot_5way_1shot --cuda`

\*\* achieved running `python train.py --exp omniglot_5way_5shot --num_samples 5 --cuda`

° achieved running `python train.py --exp omniglot_20way_1shot --num_cls 20 --cuda`

°° achieved running `python train.py --exp omniglot_20way_5shot --num_cls 20 --num_samples 5 --cuda`

### Mini-Imagenet:
In progress. Writing the code for the experiments should be done soon, but the main bottleneck for me is compute; if someone would be willing to run the experiments and report numbers, it would be much appreciated.

### RL:
In progress.