https://github.com/ondrejbiza/vae
Variational autoencoders implemented in Tensorflow.
- Host: GitHub
- URL: https://github.com/ondrejbiza/vae
- Owner: ondrejbiza
- License: mit
- Created: 2019-01-28T14:30:40.000Z (over 6 years ago)
- Default Branch: master
- Last Pushed: 2019-10-18T18:45:04.000Z (almost 6 years ago)
- Last Synced: 2025-01-08T11:41:46.969Z (9 months ago)
- Topics: deep-learning, deep-neural-networks, tensorflow, variational-autoencoders, vq-vae
- Language: Python
- Size: 884 KB
- Stars: 1
- Watchers: 3
- Forks: 1
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
# Variational Autoencoders in Tensorflow
## Set up
* Install Python >= 3.6.
* Install packages in *requirements.txt*.
* Tested with tensorflow-gpu 1.7.0 (CUDA 9.1, cuDNN 7.1) and tensorflow-gpu 1.14.0 (CUDA 10.0, cuDNN 7.6).
* For tensorflow-gpu 1.14.0, use the `--fix-cudnn` flag if you get a cuDNN initialization error.
## Usage

### Autoencoder:
```
# ConvNet on MNIST
python -m vae.scripts.ae_conv_mnist
```

MNIST, default settings: -54.26 test log-likelihood (1 run)
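For reference, the reported test log-likelihoods can be computed per image from the decoder's outputs. A minimal numpy sketch (illustrative only, not the repo's TensorFlow code), assuming a Bernoulli decoder over binarized pixels:

```python
import numpy as np

def bernoulli_log_likelihood(x, x_logits):
    """Per-image log p(x|z) for binary pixels x under decoder logits."""
    # Stable form computed directly from logits:
    # log sigmoid(l) = -softplus(-l), log(1 - sigmoid(l)) = -softplus(l).
    return np.sum(x * -np.logaddexp(0.0, -x_logits)
                  + (1.0 - x) * -np.logaddexp(0.0, x_logits), axis=-1)

# Tiny 3-pixel example (hypothetical values, just to show the shape of the call).
x = np.array([[1.0, 0.0, 1.0]])
logits = np.array([[2.0, -2.0, 0.0]])
ll = bernoulli_log_likelihood(x, logits)  # one log-likelihood per image
```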
### Variational Autoencoder (VAE):
```
# ConvNet on MNIST
python -m vae.scripts.vae_conv_mnist

# fully-connected net on MNIST
python -m vae.scripts.vae_fc_mnist
```

Paper: https://arxiv.org/abs/1312.6114
MNIST, ConvNet, default settings: -71.52 test log-likelihood (1 run)
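The two key VAE ingredients from the paper above — the reparameterization trick and the closed-form KL divergence to a standard normal prior — can be sketched in plain numpy (an illustrative sketch, not the repo's TensorFlow implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

def reparameterize(mu, log_var):
    # z = mu + sigma * eps with eps ~ N(0, I): the noise is external,
    # so gradients can flow through mu and log_var during training.
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def kl_to_standard_normal(mu, log_var):
    # KL( N(mu, diag(sigma^2)) || N(0, I) ), summed over latent dimensions.
    return 0.5 * np.sum(np.exp(log_var) + mu ** 2 - 1.0 - log_var, axis=-1)

# When the posterior equals the prior (mu = 0, log_var = 0), the KL is zero.
mu = np.zeros((4, 2))
log_var = np.zeros((4, 2))
z = reparameterize(mu, log_var)
kl = kl_to_standard_normal(mu, log_var)
```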
### VampPrior VAE:
```
# ConvNet on MNIST
python -m vae.scripts.vampprior_vae_conv_mnist

# fully-connected net on a toy dataset
python -m vae.scripts.vampprior_vae_fc_toy
```

Paper: https://arxiv.org/abs/1705.07120
MNIST, default settings: -70.08 test log-likelihood (1 run)
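The VampPrior replaces the standard normal prior with a uniform mixture of the encoder's posteriors evaluated at K learned pseudo-inputs. A numpy sketch of the prior's log-density (names and shapes are my own, not the repo's; the pseudo-posterior means and variances would come from running the encoder on the pseudo-inputs):

```python
import numpy as np

def log_normal_diag(z, mu, log_var):
    # log N(z; mu, diag(exp(log_var))), summed over latent dimensions.
    return -0.5 * np.sum(log_var + np.log(2 * np.pi)
                         + (z - mu) ** 2 / np.exp(log_var), axis=-1)

def vamp_prior_log_prob(z, pseudo_mu, pseudo_log_var):
    # log p(z) = log (1/K) sum_k q(z | u_k) for K pseudo-inputs u_k.
    K = pseudo_mu.shape[0]
    comps = np.stack([log_normal_diag(z, pseudo_mu[k], pseudo_log_var[k])
                      for k in range(K)], axis=-1)      # (batch, K)
    m = comps.max(axis=-1, keepdims=True)               # stable log-sum-exp
    return m[..., 0] + np.log(np.exp(comps - m).sum(axis=-1)) - np.log(K)

# With a single standard-normal component this reduces to the usual N(0, I) prior.
pseudo_mu = np.zeros((1, 2))
pseudo_log_var = np.zeros((1, 2))
lp = vamp_prior_log_prob(np.zeros((3, 2)), pseudo_mu, pseudo_log_var)
```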
### Gaussian Mixture Prior VAE:
```
# ConvNet on MNIST
python -m vae.scripts.gmprior_vae_conv_mnist

# fully-connected net on a toy dataset
python -m vae.scripts.gmprior_vae_fc_toy
```

Baseline from https://arxiv.org/abs/1705.07120
MNIST, ConvNet, default settings: -69.58 test log-likelihood (1 run)
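Unlike the VampPrior, a Gaussian-mixture prior parameterizes its components directly in latent space; generating an image then means picking a component, sampling a latent from it, and decoding. An illustrative numpy sketch of that sampling step (uniform mixture weights assumed, names my own):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_gm_prior(n, means, log_vars, rng):
    # Draw n latents from a uniform mixture of K diagonal Gaussians:
    # choose a component per sample, then sample from that Gaussian.
    K, d = means.shape
    ks = rng.integers(0, K, size=n)
    eps = rng.standard_normal((n, d))
    return means[ks] + np.exp(0.5 * log_vars[ks]) * eps

# Two well-separated components at (-2, 0) and (2, 0) with unit variance.
means = np.array([[-2.0, 0.0], [2.0, 0.0]])
log_vars = np.zeros((2, 2))
z = sample_gm_prior(1000, means, log_vars, rng)
```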
### Softmax-Gumbel VAE:
```
# ConvNet on MNIST
python -m vae.scripts.sg_vae_conv_mnist
```

Paper: https://arxiv.org/abs/1611.01144
MNIST, default settings: -81.56 test log-likelihood (1 run)
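The Gumbel-softmax (concrete) relaxation from the paper above samples approximately one-hot categorical vectors by perturbing logits with Gumbel noise and applying a tempered softmax; as the temperature goes to zero, samples approach one-hot. A minimal numpy sketch (illustrative, not the repo's code):

```python
import numpy as np

rng = np.random.default_rng(0)

def gumbel_softmax(logits, temperature, rng):
    # Gumbel(0, 1) noise via inverse transform: g = -log(-log(u)), u ~ U(0, 1).
    u = rng.uniform(1e-12, 1.0, size=logits.shape)
    g = -np.log(-np.log(u))
    # Tempered softmax over perturbed logits, shifted for numerical stability.
    y = (logits + g) / temperature
    y = y - y.max(axis=-1, keepdims=True)
    e = np.exp(y)
    return e / e.sum(axis=-1, keepdims=True)

# One relaxed sample over three categories; rows sum to 1.
sample = gumbel_softmax(np.array([[1.0, 0.0, -1.0]]), temperature=0.5, rng=rng)
```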
### Vector Quantization VAE (VQ-VAE):
More or less a one-to-one copy of https://github.com/hiwonjoon/tf-vqvae/blob/master/model.py:
```
python -m vae.scripts.vq_vae_fully_conv_mnist
```

My own version that seems to produce better samples:
```
python -m vae.scripts.vq_vae_conv_mnist
```

Paper: https://arxiv.org/abs/1711.00937
I'm not sure how to measure the test log-likelihood here.
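The VQ-VAE bottleneck itself is simple to state: each encoder output vector is replaced by its nearest codebook entry under Euclidean distance, and the resulting indices form the discrete code. An illustrative numpy sketch of that quantization step (shapes and names are my own, not the repo's):

```python
import numpy as np

def quantize(z_e, codebook):
    # Squared Euclidean distance from each encoder vector to each codebook
    # entry; pick the nearest entry per vector.
    d = ((z_e[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)  # (N, K)
    idx = d.argmin(axis=1)
    return idx, codebook[idx]

# Two codebook entries; each encoder output snaps to the closer one.
codebook = np.array([[0.0, 0.0], [1.0, 1.0]])
z_e = np.array([[0.1, -0.2], [0.9, 1.2]])
idx, z_q = quantize(z_e, codebook)
```

In training, the straight-through estimator copies gradients from `z_q` back to `z_e` across this non-differentiable step, which is why the quantization can sit inside an otherwise end-to-end model.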
## Notes
* The architecture of all ConvNets is based on this paper (https://arxiv.org/abs/1803.10122), with half the number of filters.