Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/asperti/BalancingVAE
An implementation of Variational Autoencoders with a constant balance between reconstruction error and Kullback-leibler divergence
- Host: GitHub
- URL: https://github.com/asperti/BalancingVAE
- Owner: asperti
- Created: 2020-01-30T19:06:53.000Z (almost 5 years ago)
- Default Branch: master
- Last Pushed: 2020-05-23T07:03:34.000Z (over 4 years ago)
- Last Synced: 2024-07-04T02:13:01.925Z (4 months ago)
- Language: Python
- Size: 4.75 MB
- Stars: 19
- Watchers: 2
- Forks: 2
- Open Issues: 2
- Metadata Files:
- Readme: README.md
README
This repository contains an implementation of Variational Autoencoders with a new balancing strategy between the reconstruction error and the KL-divergence in the loss function.
Specifically, we enforce a constant balance between these two components by normalizing the reconstruction error with an estimate of its current value, computed from minibatches.
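The sketch below illustrates this balancing idea; it is not the repository's code. It is written in PyTorch for brevity (the repository's own implementation may use a different framework), and names such as `BalancedVAELoss` and the `momentum` parameter are illustrative assumptions.

```python
# Illustrative sketch, assuming a standard Gaussian-prior VAE:
# the reconstruction term is divided by a running minibatch estimate
# of its own magnitude, keeping it at a roughly constant scale
# relative to the KL term.
import torch


class BalancedVAELoss:
    def __init__(self, momentum=0.99):
        self.momentum = momentum   # smoothing factor for the running estimate (assumed value)
        self.rec_estimate = None   # running estimate of the reconstruction error

    def __call__(self, x, x_hat, mu, logvar):
        # Per-sample reconstruction error (squared error summed over pixels, averaged over the batch).
        rec = ((x - x_hat) ** 2).flatten(1).sum(dim=1).mean()

        # Update the minibatch-based estimate of the current reconstruction error.
        with torch.no_grad():
            value = rec.detach()
            if self.rec_estimate is None:
                self.rec_estimate = value
            else:
                self.rec_estimate = (self.momentum * self.rec_estimate
                                     + (1.0 - self.momentum) * value)

        # Standard Gaussian KL-divergence term.
        kl = -0.5 * (1 + logvar - mu ** 2 - logvar.exp()).sum(dim=1).mean()

        # Normalizing the reconstruction term by its current estimate keeps the
        # two components of the loss in a constant balance.
        return rec / self.rec_estimate + kl
```

Because the estimate is detached from the computation graph, gradients flow only through the current reconstruction term, so the normalization acts as an adaptive weighting rather than a learned parameter.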
We derived this technique from an investigation of the loss function used by Dai and Wipf for their Two-Stage VAE, where the balancing parameter was instead learned during training.
Our technique seems to outperform all previous variational approaches, permitting us to obtain unprecedented FID scores for traditional datasets such as CIFAR-10 and CelebA.
The code is largely based on Dai and Wipf's code at https://github.com/daib13/TwoStageVAE.
The code for computing FID is a minor adaptation of the code at https://github.com/tsc2017/Frechet-Inception-Distance.