https://github.com/vita-group/gan-lth
[ICLR 2021] "GANs Can Play Lottery Too" by Xuxi Chen, Zhenyu Zhang, Yongduo Sui, Tianlong Chen
- Host: GitHub
- URL: https://github.com/vita-group/gan-lth
- Owner: VITA-Group
- License: mit
- Created: 2021-01-13T03:49:52.000Z (almost 5 years ago)
- Default Branch: main
- Last Pushed: 2022-02-18T16:57:32.000Z (over 3 years ago)
- Topics: gan, generative-adversarial-network, lottery-ticket-hypothesis, pruning, transfer
- Language: Python
- Size: 250 KB
- Stars: 26
- Watchers: 9
- Forks: 7
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
# GANs Can Play Lottery Tickets Too
[![License: MIT](https://img.shields.io/badge/License-MIT-blue.svg)](https://opensource.org/licenses/MIT)
Code for the ICLR 2021 paper [GANs Can Play Lottery Tickets Too](https://openreview.net/forum?id=1AoMhc_9jER).
## Overview
For a range of GANs, we can find matching subnetworks at 67%-74% sparsity. We observe that whether or not the discriminator is pruned has only a minor effect on the existence and quality of matching subnetworks, while the initialization used in the discriminator plays a significant role.
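A matching subnetwork is obtained by masking out low-magnitude weights and rewinding the rest to their original initialization. As a minimal sketch (not the repository's code), one-shot magnitude pruning of a single weight tensor looks like this:

```python
import torch

def magnitude_prune(weights, sparsity):
    """Return a binary mask that zeroes the smallest-magnitude fraction of weights."""
    flat = weights.abs().flatten()
    k = int(sparsity * flat.numel())
    if k == 0:
        return torch.ones_like(weights)
    threshold = flat.kthvalue(k).values  # k-th smallest magnitude
    return (weights.abs() > threshold).float()

# Example: prune 74% of a randomly initialized layer, as in the sparsest tickets
w = torch.randn(256, 128)
mask = magnitude_prune(w, 0.74)
print(f"remaining: {mask.mean().item():.2f}")  # → remaining: 0.26
```

Iterating this step (pruning a fraction of the remaining weights each round) yields the iterative magnitude pruning (IMP) schedules used below.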
## Experiment Results
Iterative pruning results on SNGAN

## Requirements
```
pytorch==1.4.0
tensorflow-gpu==1.15.0
imageio
scikit-image
tqdm
tensorboardx
```
## Command
### SNGAN
#### Generate Initial Weights
```
mkdir initial_weights
python generate_initial_weights.py --model sngan_cifar10
```
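The saved initial weights are what the lottery-ticket procedure rewinds to after each pruning round. A minimal sketch of what such a script does, using a hypothetical stand-in module rather than the actual `sngan_cifar10` generator:

```python
import os
import torch
import torch.nn as nn

# Hypothetical stand-in for the SNGAN generator; the real script instantiates
# sngan_cifar10 and dumps its untrained parameters so that pruned tickets can
# later be rewound to exactly these values.
os.makedirs("initial_weights", exist_ok=True)
generator = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 3 * 32 * 32))
torch.save(generator.state_dict(), "initial_weights/sngan_cifar10_G.pth")
```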
#### Prepare FID statistics
Download FID statistics files from [here](https://www.dropbox.com/sh/8xhqxsxnsto18im/AAAkDr-Zf3sgXx1A7RAhlqcva?dl=0) to `fid_stat`.
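The FID statistics files store the mean and covariance of Inception features for the real dataset, so FID can be computed without re-encoding real images. The repository uses the TTUR implementation; a minimal sketch of the underlying Fréchet distance between two such Gaussians:

```python
import numpy as np
from scipy import linalg

def fid_from_stats(mu1, sigma1, mu2, sigma2):
    """Frechet distance between two Gaussians fitted to Inception features."""
    diff = mu1 - mu2
    covmean = linalg.sqrtm(sigma1 @ sigma2)
    if np.iscomplexobj(covmean):  # sqrtm can return tiny imaginary parts
        covmean = covmean.real
    return float(diff @ diff + np.trace(sigma1 + sigma2 - 2.0 * covmean))

mu, sigma = np.zeros(4), np.eye(4)
print(fid_from_stats(mu, sigma, mu, sigma))  # ≈ 0 for identical statistics
```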
#### Baseline
```
python train.py --model sngan_cifar10 --exp_name sngan_cifar10 --init-path initial_weights
```
Baseline models are also available [here](https://drive.google.com/drive/folders/1-QSfRrVpHSrHppmEf8fuAUn6Nv-N2z2R?usp=sharing).
#### Iterative Magnitude Pruning on Generator (IMPG)
```
python train_impg.py --model sngan_cifar10 --exp_name sngan_cifar10 --init-path initial_weights --load-path LOAD_PATH
```
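Each IMP round prunes a fraction of the surviving generator weights, rewinds the rest to the saved initialization, and retrains with the mask applied. A hypothetical sketch of one round (the training loop itself is omitted; names here are illustrative, not the repository's API):

```python
import torch
import torch.nn as nn

def imp_round(model, masks, prune_rate, init_state):
    """One IMP round: prune `prune_rate` of remaining weights, then rewind."""
    for name, p in model.named_parameters():
        if "weight" not in name:
            continue
        masked = (p.data * masks[name]).abs()
        alive = masked[masks[name] > 0]          # only still-active weights
        k = int(prune_rate * alive.numel())
        if k > 0:
            thresh = alive.kthvalue(k).values
            masks[name] = ((masked > thresh) & (masks[name] > 0)).float()
    model.load_state_dict(init_state)            # rewind to initialization
    for name, p in model.named_parameters():     # re-apply the sparsity mask
        if name in masks:
            p.data.mul_(masks[name])
    return masks
```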
#### Iterative Magnitude Pruning on Generator and Discriminator (IMPGD)
```
python train_impgd.py --model sngan_cifar10 --exp_name sngan_cifar10 --init-path initial_weights
```
#### Iterative Magnitude Pruning on Generator and Discriminator with Knowledge Distillation (IMPGDKD)
```
python train_impgd.py --model sngan_cifar10 --exp_name sngan_cifar10 --init-path initial_weights --use-kd-d
```
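With `--use-kd-d`, a frozen dense discriminator can serve as a teacher for the sparse one. The exact distillation loss used in the repository may differ; a common and minimal choice is to match the student's logits to the teacher's, sketched here with hypothetical names:

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, weight=1.0):
    """Match the sparse discriminator's logits to the frozen dense teacher's.

    MSE on logits is one common distillation objective; this is a sketch,
    not necessarily the loss the repository implements.
    """
    return weight * F.mse_loss(student_logits, teacher_logits.detach())
```

This term would be added to the usual adversarial loss when updating the pruned discriminator.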
### CycleGAN
#### Generate initial weights
```
mkdir initial_weights
python generate_initial_weights.py
```
#### Download Data
```
./download_dataset DATASET_NAME
```
#### Baseline
```
python train.py --dataset DATASET_NAME --rand initial_weights --gpu GPU
```
#### IMPG
```
python train_impg.py --dataset DATASET_NAME --rand initial_weights --gpu GPU --pretrain PRETRAIN
```
#### IMPGD
```
python train_impgd.py --dataset DATASET_NAME --rand initial_weights --gpu GPU --pretrain PRETRAIN
```
## Acknowledgement
The Inception Score code is taken from OpenAI's Improved GAN (official implementation), and the FID code and CIFAR-10 statistics file are from https://github.com/bioinf-jku/TTUR (official implementation).