https://github.com/SleepyLin/TASR
TASR: Timestep-Aware Diffusion Model for Image Super-Resolution
- Host: GitHub
- URL: https://github.com/SleepyLin/TASR
- Owner: SleepyLin
- License: apache-2.0
- Created: 2024-12-03T11:09:20.000Z (11 months ago)
- Default Branch: main
- Last Pushed: 2025-02-17T03:36:52.000Z (9 months ago)
- Last Synced: 2025-02-17T04:27:06.873Z (9 months ago)
- Language: Python
- Size: 1.95 KB
- Stars: 9
- Watchers: 4
- Forks: 0
- Open Issues: 1
Metadata Files:
- Readme: README.md
- License: LICENSE
## TASR: Timestep-Aware Diffusion Model for Image Super-Resolution
[Paper](https://arxiv.org/abs/2412.03355)
```shell
# clone this repo
git clone https://github.com/SleepyLin/TASR.git
cd TASR
# create environment
conda create -n tasr python=3.10
conda activate tasr
pip install -r requirements.txt
```
## Inference
1. Download the pretrained [Stable Diffusion v2.1](https://huggingface.co/stabilityai/stable-diffusion-2-1-base) and [BSRNet](https://github.com/cszn/KAIR/releases/download/v1.0/BSRNet.pth) weights
2. Download the pretrained [TASR v1 Model](https://huggingface.co/SleepyLin/TASR/blob/main/tasr_v1.pth)
3. Set the correct model paths in the inference shell script, then run it:
```shell
sh scripts/inference.sh
```
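The steps above can be sketched as a small setup script. Everything below is an assumption for illustration: the checkpoint filenames, the `weights/` directory, and the `SD_CKPT`/`BSRNET_CKPT`/`TASR_CKPT` variable names are hypothetical, not names the repository defines; adjust them to whatever `scripts/inference.sh` actually expects.

```shell
# Hypothetical layout -- adapt names and paths to scripts/inference.sh.
mkdir -p weights

# Place the downloaded checkpoints here (filenames are assumptions):
#   weights/v2-1_512-ema-pruned.ckpt  <- Stable Diffusion v2.1 base
#   weights/BSRNet.pth                <- BSRNet
#   weights/tasr_v1.pth               <- TASR v1

# Export the paths so the inference script (or your edits to it)
# can reference a single source of truth:
export SD_CKPT="weights/v2-1_512-ema-pruned.ckpt"
export BSRNET_CKPT="weights/BSRNet.pth"
export TASR_CKPT="weights/tasr_v1.pth"
echo "SD: $SD_CKPT | BSRNet: $BSRNET_CKPT | TASR: $TASR_CKPT"
```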
## Train
TODO
Set the training options in ./config/train_tast.yaml, then run:
```shell
sh scripts/train.sh
```
## Citation
Please cite our work if it is useful for your research.
```bibtex
@misc{lin2024tasrtimestepawarediffusionmodel,
  title={TASR: Timestep-Aware Diffusion Model for Image Super-Resolution},
  author={Qinwei Lin and Xiaopeng Sun and Yu Gao and Yujie Zhong and Dengjie Li and Zheng Zhao and Haoqian Wang},
  year={2024},
  eprint={2412.03355},
  archivePrefix={arXiv},
  primaryClass={cs.CV},
  url={https://arxiv.org/abs/2412.03355},
}
```
## Thanks
This project is based on [DiffBIR](https://github.com/XPixelGroup/DiffBIR). Thanks for their awesome work.