https://github.com/thiswillbeyourgithub/beta-variational-autoencoder
Simple implementation of a Beta VAE by GPT-4
- Host: GitHub
- URL: https://github.com/thiswillbeyourgithub/beta-variational-autoencoder
- Owner: thiswillbeyourgithub
- License: gpl-3.0
- Created: 2023-12-04T16:19:49.000Z (about 2 years ago)
- Default Branch: main
- Last Pushed: 2025-01-11T16:51:31.000Z (about 1 year ago)
- Last Synced: 2025-10-14T00:38:17.817Z (5 months ago)
- Language: Python
- Size: 344 KB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
# Beta-Variational-Autoencoder
Simple Beta-VAE using a scikit-learn-style API, made mostly by prompting GPT-4.
With a single argument it can instead act as a regular (non-variational) autoencoder.
The [VeLO optimizer](https://github.com/janEbert/PyTorch-VeLO) can be used (apparently only on CPU, not on CUDA?).
I made this because I couldn't find a suitable implementation in Python / PyTorch and needed one for another project: [QuestEA](https://github.com/thiswillbeyourgithub/QuestEA).
# Example result on handwritten digits:

## Notes
* There are two compression layers, and the decompression is symmetrical.
* A wrapper called `OptimizedBVAE` can be used to grid-search over the `hidden_dim` parameter and return the best model after further training.
* This side quest was done hastily; there might be big mistakes as well as unoptimized code. If you spot any, please notify me by creating an issue!
* The optimizer used is AdamW, but `VeLO` can be used from [this repo](https://github.com/janEbert/PyTorch-VeLO).
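For reference, the beta parameter simply weights the KL term of the standard VAE objective (beta = 1 recovers an ordinary VAE, beta = 0 leaves only the reconstruction term). A minimal NumPy sketch of the per-sample loss, assuming a diagonal-Gaussian posterior and mean-squared-error reconstruction (not the repo's actual code):

```python
import numpy as np

def beta_vae_loss(x, x_recon, mu, logvar, beta=1.0):
    """Per-sample beta-VAE loss: MSE reconstruction + beta * KL(q(z|x) || N(0, I))."""
    recon = np.sum((x - x_recon) ** 2)
    # Closed-form KL divergence between N(mu, diag(exp(logvar))) and N(0, I)
    kl = -0.5 * np.sum(1.0 + logvar - mu ** 2 - np.exp(logvar))
    return recon + beta * kl
```

With `beta=0` the KL term vanishes and the objective degenerates into that of a plain autoencoder, which is presumably what the "single argument" toggle above corresponds to.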
## Usage
```python
from bvae import ReducedBVAE

model = ReducedBVAE(
    input_dim,
    z_dim,  # lowest number of dimensions (latent size)
    hidden_dim,  # number of neurons in the 2nd layer of the compression
    dataset_size,
    lr=1e-3,
    epochs=1000,
    beta=1.0,
    weight_decay=0.01,
    use_VeLO=False,
    use_scheduler=True,
)
model.prepare_dataset(
    dataset=dataset,
    val_ratio=0.2,
    batch_size=500,
)
model.train_bvae(
    patience=100,
)
projection = model.transform(dataset)
```
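The `OptimizedBVAE` wrapper's internals aren't shown here; as an illustration of the idea, a grid search over `hidden_dim` amounts to training one model per candidate value and keeping the one with the lowest validation loss. A generic sketch (the `train_and_score` callable and candidate list are hypothetical, not part of the repo's API):

```python
def grid_search_hidden_dim(candidates, train_and_score):
    """Try each hidden_dim candidate and return the best one.

    `train_and_score` is a hypothetical callable mapping a hidden_dim
    to a validation loss (lower is better).
    """
    best_dim, best_loss = None, float("inf")
    for hidden_dim in candidates:
        loss = train_and_score(hidden_dim)
        if loss < best_loss:
            best_dim, best_loss = hidden_dim, loss
    return best_dim, best_loss
```

After the best `hidden_dim` is found, the wrapper's "further training" step would retrain that model for more epochs before returning it.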