https://github.com/le-big-mac/vcl_diffusion
Variational continual learning of a conditional diffusion model to generate MNIST. Based on 'Conditional Diffusion MNIST'.
- Host: GitHub
- URL: https://github.com/le-big-mac/vcl_diffusion
- Owner: le-big-mac
- License: mit
- Created: 2024-04-02T01:18:19.000Z (about 2 years ago)
- Default Branch: main
- Last Pushed: 2024-05-22T14:20:04.000Z (almost 2 years ago)
- Last Synced: 2025-07-23T10:52:24.130Z (9 months ago)
- Topics: bayesian-deep-learning, continual-learning, deep-learning, diffusion-models, neural-network
- Language: Python
- Homepage:
- Size: 46.5 MB
- Stars: 0
- Watchers: 0
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Variational continual learning for diffusion models
A codebase for training diffusion models with [variational continual learning](https://arxiv.org/abs/1710.10628), to see whether it can mitigate catastrophic forgetting.
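The core idea of VCL is that each new task's variational posterior is regularized toward the posterior learned on the previous task, via a KL term in the ELBO. A minimal NumPy sketch for mean-field Gaussian parameters follows; the function names and signatures are illustrative assumptions, not the repository's actual API:

```python
import numpy as np

def kl_diag_gaussians(mu_q, logvar_q, mu_p, logvar_p):
    """KL(q || p) between two diagonal Gaussians, summed over dimensions."""
    var_q, var_p = np.exp(logvar_q), np.exp(logvar_p)
    return 0.5 * np.sum(
        logvar_p - logvar_q + (var_q + (mu_q - mu_p) ** 2) / var_p - 1.0
    )

def vcl_loss(nll, mu_q, logvar_q, mu_prev, logvar_prev, n_data):
    """Per-task VCL objective: data negative log-likelihood plus the KL
    to the previous task's posterior, scaled by dataset size so the
    regularizer does not dominate the data term."""
    return nll + kl_diag_gaussians(mu_q, logvar_q, mu_prev, logvar_prev) / n_data
```

After training on task *t*, the learned `(mu_q, logvar_q)` are frozen and passed in as `(mu_prev, logvar_prev)` for task *t + 1*; for the first task the prior is typically a standard normal (`mu_prev = 0`, `logvar_prev = 0`).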
[Image: generations under standard training]

[Image: generations under VCL training]

Images generated during continual learning of MNIST digits, with no data replay. Row *i* shows generations after the model had been trained on the first *i* tasks (digits); column *j* shows generations for digit *j*.
Note that the variational inference used by VCL is very expensive, so this code takes several hours to run even on a powerful GPU.
The diffusion model in this project is based on original work by Tim Pearce, released under the MIT License in 2022. The original code can be found at [Conditional Diffusion MNIST](https://github.com/TeaPearce/Conditional_Diffusion_MNIST). The modifications made by me in 2024 are also released under the MIT License.