Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Diffusion on the Clouds: Short-term solar energy forecasting with Diffusion Models
https://github.com/tcapelle/cloud_diffusion
- Host: GitHub
- URL: https://github.com/tcapelle/cloud_diffusion
- Owner: tcapelle
- Created: 2023-03-10T11:37:45.000Z (over 1 year ago)
- Default Branch: master
- Last Pushed: 2024-03-21T20:24:13.000Z (8 months ago)
- Last Synced: 2024-10-01T12:37:41.817Z (about 1 month ago)
- Language: Python
- Size: 12.6 MB
- Stars: 37
- Watchers: 2
- Forks: 10
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
[![](https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-gradient.svg)](https://wandb.ai/capecape/ddpm_clouds/reports/Diffusion-on-the-Clouds-Short-term-solar-energy-forecasting-with-Diffusion-Models--VmlldzozNDMxNTg5)
[![PyPI version](https://badge.fury.io/py/cloud_diffusion.svg)](https://badge.fury.io/py/cloud_diffusion)

# Cloud Diffusion Experiment
You can check our GTC presentation on YouTube:
[![](assets/front.jpg)](https://www.youtube.com/watch?v=L5h9kbMMzZs)

Samples and training logs for the model generations can be found [here](https://wandb.me/gtc2023).
This codebase contains an implementation of a deep diffusion model applied to cloud images. It was developed as part of a research project exploring the potential of diffusion models for image generation and forecasting.
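As background on the technique the codebase implements, a denoising diffusion model is trained around a fixed forward (noising) process that can be sampled in closed form. The sketch below is a generic DDPM forward process in NumPy, not this repository's code; the linear beta schedule and all names are illustrative assumptions:

```python
import numpy as np

def linear_beta_schedule(timesteps, beta_start=1e-4, beta_end=0.02):
    """Linearly increasing per-step noise variances beta_1..beta_T."""
    return np.linspace(beta_start, beta_end, timesteps)

def q_sample(x0, t, alphas_cumprod, noise):
    """Sample x_t ~ q(x_t | x_0) in closed form:
    x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * noise."""
    a_bar = alphas_cumprod[t]
    return np.sqrt(a_bar) * x0 + np.sqrt(1.0 - a_bar) * noise

T = 1000
betas = linear_beta_schedule(T)
alphas_cumprod = np.cumprod(1.0 - betas)  # alpha_bar_t, decreasing in t

rng = np.random.default_rng(0)
x0 = rng.standard_normal((1, 64, 64))   # a fake 64x64 "cloud image"
noise = rng.standard_normal(x0.shape)
xt = q_sample(x0, t=999, alphas_cumprod=alphas_cumprod, noise=noise)
# Near t = T, alpha_bar_t is close to 0, so x_t is almost pure noise.
```

The model then learns to reverse this process step by step, which is what makes sampling new (or future) cloud images possible.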
## Setup
1. Clone this repository and run `pip install -e .`, or install the package directly with `pip install cloud_diffusion`.
2. Set up your WandB account by signing up at [wandb.ai](https://wandb.ai/site).
3. Set up your WandB API key by running `wandb login` and following the prompts.

## Usage
To train the model, run `python train.py`. You can adjust the parameters at the top of the file to change the model architecture, training hyperparameters, etc.
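For orientation, the core step inside a diffusion training loop like this is the standard DDPM objective: noise a clean frame at a random timestep and regress the added noise with an MSE loss. A minimal NumPy sketch, where the zero-predictor `model` and all names are illustrative stand-ins, not the actual UNet or API in `train.py`:

```python
import numpy as np

# Cumulative signal-retention schedule alpha_bar_t, from a linear
# beta schedule (an assumption; the repo may use a different one).
alphas_cumprod = np.cumprod(1.0 - np.linspace(1e-4, 0.02, 1000))

def training_loss(model, x0, t, noise):
    """One DDPM training step: noise x0 to x_t, predict the noise."""
    a_bar = alphas_cumprod[t]
    xt = np.sqrt(a_bar) * x0 + np.sqrt(1.0 - a_bar) * noise
    eps_hat = model(xt, t)                  # predicted noise
    return np.mean((eps_hat - noise) ** 2)  # simple MSE objective

def zero_model(xt, t):
    """Dummy predictor standing in for the real denoising network."""
    return np.zeros_like(xt)

rng = np.random.default_rng(0)
x0 = rng.standard_normal((1, 64, 64))  # a fake 64x64 "cloud image"
noise = rng.standard_normal(x0.shape)
loss = training_loss(zero_model, x0, t=500, noise=noise)
# With a zero predictor, the loss is just mean(noise**2), roughly 1.0.
```

In the real training loop, `t` is drawn uniformly per batch and the loss is backpropagated through the network.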
You can also override the configuration parameters by passing them as command-line arguments, e.g.
```bash
> python train.py --epochs=10 --batch_size=32
```

## Training a Simple Diffusion Model
This training is based on a transformer-based UNet (UViT). You can train the default model by running:
```bash
> python train_uvit.py
```

## Running Inference
If you are only interested in using the trained models, you can run inference with:

```bash
> python inference.py --future_frames 10 --num_random_experiments 2
```

This will generate 10 future frames for 2 random experiments.
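Conceptually, the two flags correspond to the length of the autoregressive rollout and the number of independent stochastic samples drawn from the model. A rough NumPy sketch of that loop, with a placeholder sampler standing in for the real diffusion model (all names here are assumptions, not the actual API of `inference.py`):

```python
import numpy as np

def sample_next_frame(past_frames, rng):
    """Placeholder sampler: the real model denoises from pure noise,
    conditioned on the past frames."""
    return past_frames[-1] + 0.1 * rng.standard_normal(past_frames[-1].shape)

def rollout(past_frames, future_frames, num_random_experiments, seed=0):
    """Run several independent stochastic rollouts of future frames."""
    runs = []
    for i in range(num_random_experiments):
        rng = np.random.default_rng(seed + i)  # fresh noise per experiment
        frames = list(past_frames)
        for _ in range(future_frames):
            frames.append(sample_next_frame(frames, rng))
        runs.append(np.stack(frames[len(past_frames):]))
    return np.stack(runs)  # shape: (experiments, future_frames, H, W)

past = [np.zeros((64, 64)) for _ in range(4)]  # conditioning frames
preds = rollout(past, future_frames=10, num_random_experiments=2)
# preds.shape == (2, 10, 64, 64)
```

The spread between the independent runs gives a rough per-pixel uncertainty estimate for the forecast.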
## License
This code is released under the [MIT License](LICENSE).