https://github.com/Luciennnnnnn/liif-lightning-hydra
A reimplementation of LIIF (Learning Continuous Image Representation with Local Implicit Image Function) using PyTorch Lightning + Hydra
deep-learning hydra implicit-neural-representation machine-learning pytorch-lightning super-resolution
- Host: GitHub
- URL: https://github.com/Luciennnnnnn/liif-lightning-hydra
- Owner: Luciennnnnnn
- License: MIT
- Created: 2021-03-17T03:00:05.000Z (over 4 years ago)
- Default Branch: main
- Last Pushed: 2021-03-26T11:04:46.000Z (over 4 years ago)
- Last Synced: 2024-11-14T09:07:29.566Z (11 months ago)
- Topics: deep-learning, hydra, implicit-neural-representation, machine-learning, pytorch-lightning, super-resolution
- Language: Python
- Size: 69.3 KB
- Stars: 10
- Watchers: 1
- Forks: 1
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# LIIF with Lightning + Hydra
## Description
A reimplementation of [Learning Continuous Image Representation with Local Implicit Image Function](https://arxiv.org/abs/2012.09161) using PyTorch Lightning and Hydra, based on [this](https://github.com/hobogalaxy/lightning-hydra-template) awesome template.
## How to run
Install dependencies:
```bash
# clone project
git clone git@github.com:Luciennnnnnn/liif-lightning-hydra.git
cd liif-lightning-hydra
# [OPTIONAL] create conda environment
conda env create -f conda_env_gpu.yaml -n your_env_name
conda activate your_env_name
# install requirements
pip install -r requirements.txt
```
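Before training, an optional sanity check can confirm the environment is usable; this is just a quick import test, assuming the GPU build of PyTorch was installed:
```bash
# optional sanity check: PyTorch and Lightning import correctly and a GPU is visible
python -c "import torch, pytorch_lightning as pl; print(pl.__version__, torch.cuda.is_available())"
```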
Train the model with the default configuration (trains on DIV2K using the visible GPUs):
```bash
python run.py
# specify which GPUs to use
python run.py trainer.gpus=[0,2,5]
# use CPU only
python run.py trainer.gpus=0
```
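Hydra also supports sweeping over values with `-m`/`--multirun`. The key below (`optimizer.lr`) is taken from the override example at the end of this README; adjust it if this project's config layout differs:
```bash
# launch one run per listed learning rate (Hydra multirun)
python run.py -m optimizer.lr=0.0001,0.0005
```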
Train the model with a chosen experiment configuration:
```bash
# experiment configurations are placed in the folder `configs/experiment/`
python run.py +experiment=exp_example_simple
# train on CelebAHQ
python run.py +experiment=train_on_CelebAHQ_32_256
python run.py +experiment=train_on_CelebAHQ_64_128
```
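Experiment configurations can be combined with ad-hoc overrides on the same command line; the `trainer.max_epochs` key mirrors the override example below, and the epoch count is only illustrative:
```bash
# run the CelebAHQ experiment but cap the number of training epochs
python run.py +experiment=train_on_CelebAHQ_32_256 trainer.max_epochs=100
```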
Test the model:
```bash
# test the pretrained model on DIV2K
python run.py train=false
```
You can override any parameter from the command line like this:
```bash
python run.py trainer.max_epochs=20 optimizer.lr=0.0005
```
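To inspect what configuration a given command would actually compose, Hydra's standard introspection flags are handy (these are generic Hydra options, not specific to this project):
```bash
# print the fully composed config without running the job
python run.py --cfg job
# list config groups and overridable options
python run.py --help
```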