https://github.com/idsia/kohonen-vae
Official repository for the paper "Topological Neural Discrete Representation Learning à la Kohonen" (ICML 2023 Workshop on Sampling and Optimization in Discrete Space)
- Host: GitHub
- URL: https://github.com/idsia/kohonen-vae
- Owner: IDSIA
- Created: 2023-02-13T15:22:10.000Z (about 2 years ago)
- Default Branch: master
- Last Pushed: 2023-04-27T06:29:42.000Z (about 2 years ago)
- Last Synced: 2025-04-05T09:23:03.202Z (about 1 month ago)
- Topics: kohonen-map, pytorch, self-organizing-map, vector-quantization, vq-vae
- Language: Python
- Homepage:
- Size: 77.1 KB
- Stars: 9
- Watchers: 5
- Forks: 1
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# Official source code for our paper "Topological Neural Discrete Representation Learning à la Kohonen"
## TLDR
If you want to reuse the KSOM layer, look at `layers/som_vector_quantizer.py`. It has no external dependencies and supports multi-GPU training.
An example:
```python
from layers.som_vector_quantizer import SOMGeometry, Grid, HardSOM, HardNeighborhood

geometry = SOMGeometry(
    Grid(2),
    HardNeighborhood(0.1)
)
quantizer = HardSOM(128, 512, 0.99, geometry)
loss, output, perplexity, _ = quantizer(input)
```
You can also take a look at `example.py`.
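To give an intuition for what a Kohonen-style quantizer computes, here is a minimal NumPy mock-up of vector quantization with a hard grid neighborhood. This is an illustration only: the function name, shapes, and the simple EMA update rule are assumptions for the sketch, not the repository's implementation (which is a PyTorch layer with multi-GPU support).

```python
import numpy as np

rng = np.random.default_rng(0)
side, dim = 4, 8
n_codes = side * side                          # 4x4 grid of code vectors
codebook = rng.normal(size=(n_codes, dim))
# 2-D grid coordinate of each code, used to define the neighborhood.
coords = np.stack(np.meshgrid(np.arange(side), np.arange(side),
                              indexing="ij"), axis=-1).reshape(-1, 2)

def ksom_quantize(x, radius=1, decay=0.99):
    """Quantize x (batch, dim) and pull a hard grid neighborhood of codes
    toward the inputs, Kohonen-style. Sketch only, not the paper's layer."""
    global codebook
    # Best matching unit (BMU): nearest code in Euclidean distance.
    dists = ((x[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    bmu = dists.argmin(axis=1)                            # (batch,)
    # Hard neighborhood: codes within `radius` grid steps of each BMU.
    grid_dist = np.abs(coords[bmu][:, None, :] - coords[None, :, :]).max(-1)
    mask = (grid_dist <= radius).astype(float)            # (batch, n_codes)
    # EMA pull of every neighborhood code toward the mean of its inputs.
    counts = mask.sum(0)
    target = mask.T @ x / np.maximum(counts, 1.0)[:, None]
    codebook = np.where(counts[:, None] > 0,
                        decay * codebook + (1 - decay) * target, codebook)
    return codebook[bmu], bmu

x = rng.normal(size=(32, dim))
quantized, bmu = ksom_quantize(x)
```

The neighborhood is what makes this "topological": unlike plain VQ-VAE codebook updates, codes adjacent to the winner on the grid are also moved, so nearby codes end up representing similar inputs.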
## Installation
This project requires Python 3 and PyTorch 1.8.
```bash
pip3 install -r requirements.txt
```
Create a Weights and Biases account and run
```bash
wandb login
```
More information on setting up Weights and Biases can be found at https://docs.wandb.com/quickstart.

For plotting, LaTeX is required (to avoid Type 3 fonts and to render symbols); its installation is OS specific.
## Usage
The code uses Weights and Biases for experiment tracking. The "sweeps" directory contains sweep configurations for all experiments we performed. Sweeps are primarily intended for hyperparameter optimization, but we use them to run 10 instances of each experiment.
To reproduce our results, start a sweep for each YAML file in the "sweeps" directory, then run `wandb agent` for each sweep from the main directory. This runs all the experiments and displays them on the W&B dashboard.
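For orientation, a W&B sweep configuration is a YAML file along these lines. This is a hypothetical minimal example; the program name and parameter grid here are placeholders, and the actual files in "sweeps" define the real settings:

```yaml
# Hypothetical sweep file for illustration only; see the files in sweeps/
# for the actual configurations used in the paper.
program: main.py
method: grid
parameters:
  seed:
    values: [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
```

A sweep is started with `wandb sweep <file>.yaml`, which prints a sweep ID; `wandb agent <sweep_id>` then picks up and runs the queued experiments.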
### Re-creating plots from the paper
Edit the config file "paper/config.json" and enter your project name in the field "wandb_project" (e.g. "username/modules").
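The relevant part of the config looks like the following sketch; the real "paper/config.json" may contain additional fields:

```json
{
  "wandb_project": "username/modules"
}
```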
Run the script of interest within the "paper" directory. For example:
```bash
cd paper/kohonen
python3 compare_init.py
```