https://github.com/nishant2018/wgan-s-wasserstein-gen.ai-

### Understanding Wasserstein GAN (WGAN)

**Wasserstein GAN (WGAN)** is a variant of the traditional Generative Adversarial Network (GAN) that aims to improve training stability and address issues like mode collapse. The key differences and concepts are:

1. **Critic Instead of Discriminator**:
   - Traditional GANs use a discriminator to classify inputs as real or fake.
   - WGAN replaces the discriminator with a critic that assigns an unbounded "realness" score to inputs, with no sigmoid activation on the output (see the critic sketch after this list).

2. **Wasserstein Distance**:
   - WGAN minimizes the Wasserstein distance (also known as Earth Mover's distance) between the real and generated data distributions (see the formula after this list).
   - This distance provides a smoother and more meaningful measure of the difference between distributions than the Jensen-Shannon divergence implicitly minimized by traditional GANs.

3. **Weight Clipping**:
   - To enforce the Lipschitz constraint required by the Wasserstein formulation, the weights of the critic are clipped to a small fixed range (e.g., [-0.01, 0.01]) after each update.
   - This keeps the critic K-Lipschitz for some constant determined by the clipping range and architecture; it is a simple but crude approximation of the constraint (see the clipping sketch after this list).

4. **Training Stability**:
   - WGANs train more stably than traditional GANs: the critic can be trained close to optimality without starving the generator of useful gradients.
   - They also mitigate common failure modes such as mode collapse, where the generator produces samples with limited diversity.

5. **Gradient Penalty**:
   - An enhancement called WGAN-GP (WGAN with Gradient Penalty) replaces weight clipping with a gradient penalty term.
   - This term penalizes deviations of the norm of the critic's input gradient from 1, enforcing the Lipschitz constraint more effectively (see the penalty sketch after this list).
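
For reference, the objective behind point 2, as formulated in the original WGAN paper: by the Kantorovich-Rubinstein duality, the Wasserstein-1 distance between the real distribution $P_r$ and the generator distribution $P_g$ is a supremum over 1-Lipschitz functions, which the critic $f$ approximates:

```math
W(P_r, P_g) = \sup_{\|f\|_L \le 1} \; \mathbb{E}_{x \sim P_r}[f(x)] - \mathbb{E}_{\tilde{x} \sim P_g}[f(\tilde{x})]
```

In practice this yields the critic and generator losses

```math
L_{\text{critic}} = \mathbb{E}_{\tilde{x} \sim P_g}[f(\tilde{x})] - \mathbb{E}_{x \sim P_r}[f(x)], \qquad L_{\text{gen}} = -\mathbb{E}_{\tilde{x} \sim P_g}[f(\tilde{x})]
```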
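
To make points 1 and 3 concrete, here is a minimal PyTorch sketch (an illustration, not this repository's code; the layer sizes are assumptions, and the 0.01 clipping range matches the example above): a critic with a single unbounded output, and a clipping step applied after each critic update.

```python
import torch
import torch.nn as nn

class Critic(nn.Module):
    """Like a discriminator, but outputs an unbounded score
    (no sigmoid), so it can play the role of the function f."""
    def __init__(self, in_dim=784, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden),
            nn.LeakyReLU(0.2),
            nn.Linear(hidden, hidden),
            nn.LeakyReLU(0.2),
            nn.Linear(hidden, 1),  # raw "realness" score, no sigmoid
        )

    def forward(self, x):
        return self.net(x)

def clip_critic_weights(critic, c=0.01):
    """Original WGAN constraint: clip every parameter to [-c, c]
    after each critic update, crudely bounding the Lipschitz constant."""
    with torch.no_grad():
        for p in critic.parameters():
            p.clamp_(-c, c)
```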
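
And a sketch of the penalty from point 5 (again illustrative; the coefficient that multiplies it, λ = 10 in the WGAN-GP paper, is applied by the caller): the critic's gradient is evaluated at random interpolates between real and generated samples, and its norm is pushed toward 1.

```python
import torch

def gradient_penalty(critic, real, fake, device="cpu"):
    """WGAN-GP term: (||∇_x critic(x)||_2 - 1)^2 averaged over
    random points on the lines between real and fake samples."""
    batch_size = real.size(0)
    eps = torch.rand(batch_size, 1, device=device)        # per-sample mix
    interpolated = eps * real + (1 - eps) * fake.detach()
    interpolated.requires_grad_(True)

    scores = critic(interpolated)
    grads = torch.autograd.grad(
        outputs=scores,
        inputs=interpolated,
        grad_outputs=torch.ones_like(scores),
        create_graph=True,  # the penalty itself must be differentiable
    )[0]
    grad_norm = grads.view(batch_size, -1).norm(2, dim=1)
    return ((grad_norm - 1) ** 2).mean()
```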

### Advantages of WGAN:
- **Improved Training Dynamics**: More stable and reliable training process compared to traditional GANs.
- **Better Loss Metric**: The loss function correlates with the quality of generated samples, providing a meaningful metric during training.
- **Reduced Mode Collapse**: More diverse and realistic outputs from the generator.

### Summary:
WGAN offers a robust approach to training GANs by introducing the Wasserstein distance, replacing the discriminator with a critic, and employing techniques like weight clipping or gradient penalty to enforce Lipschitz continuity. These improvements result in a more stable training process and higher-quality generated samples.
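
Putting the pieces together, a minimal training loop might look like the sketch below. This is an assumption-laden illustration, not this repository's code: the five critic updates per generator update and the RMSprop learning rate of 5e-5 follow the original WGAN paper, while `G`, `Critic`, `clip_critic_weights`, and the data loader (assumed to yield batches of flattened real samples) are the hypothetical pieces from the sketches above.

```python
import torch

def train_wgan(G, critic, loader, z_dim=64, n_critic=5, epochs=10, device="cpu"):
    opt_c = torch.optim.RMSprop(critic.parameters(), lr=5e-5)
    opt_g = torch.optim.RMSprop(G.parameters(), lr=5e-5)

    for epoch in range(epochs):
        for i, real in enumerate(loader):
            real = real.to(device)

            # Critic step: minimize E[f(fake)] - E[f(real)]
            z = torch.randn(real.size(0), z_dim, device=device)
            fake = G(z).detach()
            loss_c = critic(fake).mean() - critic(real).mean()
            opt_c.zero_grad()
            loss_c.backward()
            opt_c.step()
            clip_critic_weights(critic)  # or use gradient_penalty(...) instead

            # Generator step once per n_critic critic updates:
            # minimize -E[f(G(z))]
            if i % n_critic == n_critic - 1:
                z = torch.randn(real.size(0), z_dim, device=device)
                loss_g = -critic(G(z)).mean()
                opt_g.zero_grad()
                loss_g.backward()
                opt_g.step()
```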