https://github.com/abdulfatir/iwae-tensorflow
Tensorflow implementation of Importance Weighted Auto Encoder
- Host: GitHub
- URL: https://github.com/abdulfatir/iwae-tensorflow
- Owner: abdulfatir
- Created: 2018-09-16T09:58:16.000Z (about 7 years ago)
- Default Branch: master
- Last Pushed: 2018-09-21T12:44:11.000Z (about 7 years ago)
- Last Synced: 2025-04-01T03:35:13.962Z (6 months ago)
- Topics: autoencoder, deep-learning, generative-model, paper-implementations, variational-autoencoder
- Language: Python
- Size: 127 KB
- Stars: 5
- Watchers: 2
- Forks: 3
- Open Issues: 2
Metadata Files:
- Readme: README.md
# Importance Weighted Auto Encoder
A TensorFlow implementation of the _Importance Weighted Auto Encoder_ [1].

## Requirements
* tensorflow
* numpy
* matplotlib

## Usage
```
python main.py --dataset {mnist,omniglot} \
               --k <# of particles for training> \
               --test_k <# of particles for testing> \
               --n_steps <# of training steps> \
               --batch_size <batch size>
```

### Datasets
* MNIST - automatically downloaded by tensorflow
* OMNIGLOT - run `download_omniglot.sh`

## Results
The following are the negative log-likelihood (NLL) values after training for 400,000 steps with a batch size of 100, for different numbers of training particles (`k`) and `test_k = 5000`.

| k | NLL (MNIST) | NLL (OMNIGLOT) |
|:----:|:----:|:----:|
| 1 | 90.26 | 114.68 |
| 5 | 88.49 | 112.25 |
| 50 | 87.34 | 110.31 |

### References
[1] Burda, Y., Grosse, R. and Salakhutdinov, R., 2015. Importance Weighted Autoencoders. arXiv preprint arXiv:1509.00519.
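
The NLLs above come from the k-sample importance-weighted bound of [1]: the log-likelihood is estimated as a log-mean-exp over per-sample log importance weights. As a rough illustration (a sketch, not code from this repository; the names `iwae_bound` and `log_w` are hypothetical):

```python
# Sketch of the k-sample IWAE log-likelihood estimate from [1]
# (illustrative only; not taken from this repository).
import numpy as np

def iwae_bound(log_w):
    """Estimate log p(x) from k log importance weights
    log w_i = log p(x, z_i) - log q(z_i | x), using a
    numerically stable log-mean-exp over the last axis."""
    log_w = np.asarray(log_w, dtype=np.float64)
    m = log_w.max(axis=-1, keepdims=True)
    # log( (1/k) * sum_i exp(log_w_i) ), shifted by m for stability
    return (m + np.log(np.mean(np.exp(log_w - m), axis=-1, keepdims=True))).squeeze(-1)
```

With `k = 1` this reduces to a single-sample ELBO estimate; larger `k` gives a tighter bound, which matches the trend of decreasing NLL with `k` in the table above.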