https://github.com/sczhou/ignn
[NeurIPS 2020] Cross-Scale Internal Graph Neural Network for Image Super-Resolution
- Host: GitHub
- URL: https://github.com/sczhou/ignn
- Owner: sczhou
- Created: 2020-06-29T16:57:33.000Z (over 5 years ago)
- Default Branch: master
- Last Pushed: 2021-02-23T12:58:33.000Z (over 4 years ago)
- Last Synced: 2025-04-09T21:17:48.790Z (6 months ago)
- Topics: deep-learning, image-preprocessing, image-restoration, super-resolution
- Language: Python
- Homepage:
- Size: 146 KB
- Stars: 313
- Watchers: 18
- Forks: 40
- Open Issues: 11
Metadata Files:
- Readme: README.md
# IGNN
Code repo for "Cross-Scale Internal Graph Neural Network for Image Super-Resolution" [[paper]](https://proceedings.neurips.cc/paper/2020/file/23ad3e314e2a2b43b4c720507cec0723-Paper.pdf) [[supp]](https://proceedings.neurips.cc/paper/2020/file/23ad3e314e2a2b43b4c720507cec0723-Supplemental.pdf)
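As the title suggests, IGNN exploits cross-scale internal recurrence: small patches in an image tend to reappear at other scales, so each query patch can be linked to similar patches found in a downsampled copy, whose locations map back to larger (HR) regions of the original image. A toy NumPy sketch of this cross-scale k-nearest-neighbour matching (function names, patch size, and `k` are illustrative, not the authors' implementation):

```python
import numpy as np

def extract_patches(img, p):
    """All p x p patches of a 2-D array, flattened to rows."""
    H, W = img.shape
    rows = []
    for i in range(H - p + 1):
        for j in range(W - p + 1):
            rows.append(img[i:i + p, j:j + p].ravel())
    return np.array(rows)

def cross_scale_knn(img, scale=2, p=3, k=5):
    """For each p x p patch of `img`, find the k most similar patches in a
    `scale`-times downsampled copy.  Each match points back to a larger
    region of the original image, i.e. an HR counterpart of the query
    patch (a toy stand-in for the paper's cross-scale graph construction)."""
    H, W = img.shape
    # crude downsample by average pooling (the paper uses bicubic)
    small = img[:H - H % scale, :W - W % scale]
    small = small.reshape(H // scale, scale, W // scale, scale).mean(axis=(1, 3))
    q = extract_patches(img, p)      # queries at the original scale
    db = extract_patches(small, p)   # candidate patches at the coarse scale
    # squared L2 distance between every query and every candidate
    d = ((q[:, None, :] - db[None, :, :]) ** 2).sum(-1)
    return np.argsort(d, axis=1)[:, :k]  # k neighbour indices per query

nn = cross_scale_knn(np.random.rand(16, 16))
print(nn.shape)  # one row of k neighbour indices per query patch
```

In the actual network this matching is done on learned feature maps and the matched HR patches are aggregated by a graph neural network; the sketch only shows the cross-scale search itself.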
## Prepare datasets
1. Download the training dataset and test datasets from [here](https://drive.google.com/file/d/1fFBCXkUIgHkjqWiCeW7w-1TYHE0A2ZZF/view?usp=sharing).
2. Crop the DIV2K training dataset into sub-images:
```
python ./datasets/prepare_DIV2K_subimages.py
```
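The cropping step amounts to sliding a fixed window over each large DIV2K image and saving the crops. A minimal sketch (the crop size of 480 and stride of 240 are assumptions; the actual values live in `prepare_DIV2K_subimages.py`):

```python
import numpy as np

def crop_to_subimages(img, size=480, step=240):
    """Slide a `size` x `size` window with stride `step` over an HWC array
    and return the crops -- the same idea the DIV2K preparation script
    applies to every training image before saving them to disk."""
    h, w = img.shape[:2]
    subs = []
    for top in range(0, max(h - size, 0) + 1, step):
        for left in range(0, max(w - size, 0) + 1, step):
            subs.append(img[top:top + size, left:left + size])
    return subs

# a dummy 1080 x 1920 RGB image stands in for a DIV2K frame
subs = crop_to_subimages(np.zeros((1080, 1920, 3)))
print(len(subs))
```

Training on fixed-size sub-images keeps batching simple and avoids repeatedly decoding very large PNGs.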
Remember to modify `input_folder` and `save_folder` in the above script.

## Dependencies and Installation
The code is tested with Python 3.7, PyTorch 1.1.0 and CUDA 9.0, but is likely to run with newer versions of PyTorch and CUDA.

1. Create a conda environment:
```
conda create --name ignn
conda activate ignn
conda install pytorch=1.1.0 torchvision=0.3.0 cudatoolkit=9.0 -c pytorch
```
2. Install PyInn:
```
pip install git+https://github.com/szagoruyko/pyinn.git@master
```
3. Install matmul_cuda:
```
bash install.sh
```
4. Install other dependencies:
```
pip install -r requirements.txt
```

## Pretrained Models
Download the pretrained models from this [link](https://drive.google.com/drive/folders/1xS0jATn0MddZkLl2Rx9VPLh-U_rUxjt1?usp=sharing) and put them into `./ckpt`.

## Training
Use the following command to train the network:
```
python runner.py \
--gpu [gpu_id]\
--phase 'train'\
--scale [2/3/4]\
--dataroot [dataset root]\
--out [output path]
```
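Training learns the LR-to-HR mapping from paired crops; the LR inputs in the downloaded DIV2K set are conventionally produced by bicubic downsampling of the HR images. A crude stand-in using average pooling (not the actual bicubic kernel) illustrates how such pairs relate:

```python
import numpy as np

def make_lr(hr, scale):
    """Produce a low-resolution counterpart of an HR crop by average
    pooling over scale x scale blocks -- a crude stand-in for the bicubic
    downsampling conventionally used to build DIV2K LR/HR training pairs."""
    h, w = hr.shape[:2]
    hr = hr[:h - h % scale, :w - w % scale]  # trim to a multiple of scale
    return hr.reshape(h // scale, scale, w // scale, scale, -1).mean(axis=(1, 3))

# a dummy 48 x 48 RGB crop paired with its x4 LR input
lr = make_lr(np.random.rand(48, 48, 3), scale=4)
print(lr.shape)  # (12, 12, 3)
```

The `--scale` flag above selects which such pairing (x2, x3 or x4) the network is trained on.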
Use the following command to resume training the network:
```
python runner.py \
--gpu [gpu_id]\
--phase 'resume'\
--weights './ckpt/IGNN_x[2/3/4].pth'\
--scale [2/3/4]\
--dataroot [dataset root]\
--out [output path]
```
You can also use the following simple command with different settings in `config.py`:
```
python runner.py
```

## Testing
Use the following command to test the network on benchmark datasets (w/ GT):
```
python runner.py \
--gpu [gpu_id]\
--phase 'test'\
--weights './ckpt/IGNN_x[2/3/4].pth'\
--scale [2/3/4]\
--dataroot [dataset root]\
--testname [Set5, Set14, BSD100, Urban100, Manga109]\
--out [output path]
```

Use the following command to test the network on your demo images (w/o GT):
```
python runner.py \
--gpu [gpu_id]\
--phase 'test'\
--weights './ckpt/IGNN_x[2/3/4].pth'\
--scale [2/3/4]\
--demopath [test folder path]\
--testname 'Demo'\
--out [output path]
```

You can also use the following simple command with different settings in `config.py`:
```
python runner.py
```

## Visual Results (x4)
For visual comparison on the five benchmark datasets, you can download our IGNN results from [here](https://drive.google.com/file/d/15x81tYQVpml4OvFqbA05mQQSRKL8phxz/view?usp=sharing).

### Some examples


## Citation
If you find our work useful for your research, please consider citing the following paper :)
```
@inproceedings{zhou2020cross,
title={Cross-scale internal graph neural network for image super-resolution},
author={Zhou, Shangchen and Zhang, Jiawei and Zuo, Wangmeng and Loy, Chen Change},
booktitle={Advances in Neural Information Processing Systems},
year={2020}
}
```
## Contact

We are glad to hear from you. If you have any questions, please feel free to contact shangchenzhou@gmail.com.
## License
This project is open-sourced under the MIT License.