https://github.com/LowLevelAI/GLARE
Official implementation of GLARE, accepted to ECCV 2024.
- Host: GitHub
- URL: https://github.com/LowLevelAI/GLARE
- Owner: LowLevelAI
- License: apache-2.0
- Created: 2024-07-08T04:55:13.000Z (11 months ago)
- Default Branch: main
- Last Pushed: 2024-12-31T11:25:01.000Z (5 months ago)
- Last Synced: 2024-12-31T12:25:52.838Z (5 months ago)
- Topics: codebook, eccv24, low-light-image-enhancement, normalizing-flow
- Language: Python
- Homepage: https://arxiv.org/pdf/2407.12431
- Size: 3.4 MB
- Stars: 83
- Watchers: 5
- Forks: 3
- Open Issues: 9
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE.txt
README
# [ECCV 2024] GLARE: Low Light Image Enhancement via Generative Latent Feature based Codebook Retrieval [[Paper]](https://arxiv.org/pdf/2407.12431)
Han Zhou<sup>1,\*</sup>, Wei Dong<sup>1,\*</sup>, Xiaohong Liu<sup>2,†</sup>, Shuaicheng Liu<sup>3</sup>, Xiongkuo Min<sup>2</sup>, Guangtao Zhai<sup>2</sup>, Jun Chen<sup>1,†</sup>

<sup>1</sup>McMaster University, <sup>2</sup>Shanghai Jiao Tong University, <sup>3</sup>University of Electronic Science and Technology of China

<sup>\*</sup>Equal Contribution, <sup>†</sup>Corresponding Authors
Papers with Code benchmarks: [LOLv2](https://paperswithcode.com/sota/low-light-image-enhancement-on-lolv2?p=glare-low-light-image-enhancement-via) · [LOLv2-1](https://paperswithcode.com/sota/low-light-image-enhancement-on-lolv2-1?p=glare-low-light-image-enhancement-via) · [LOL](https://paperswithcode.com/sota/low-light-image-enhancement-on-lol?p=glare-low-light-image-enhancement-via)

### Introduction
This repository contains the official implementation of our ECCV 2024 paper **GLARE: Low Light Image Enhancement via Generative Latent Feature based Codebook Retrieval**. If you find this repo useful, please give it a star ⭐ and consider citing our paper in your research. Thank you.
We present GLARE, a novel network for low-light image enhancement.
- **Codebook-based LLIE**: exploits normal-light (NL) images to build an NL codebook prior that guides enhancement.
- **Generative Feature Learning**: develops an invertible latent normalizing-flow strategy for feature alignment.
- **Adaptive Feature Transformation**: adaptively introduces input information into the decoding process and allows flexible adjustment by users.
- **Future work**: the network structure can be further optimized for efficiency and performance.

### Overall Framework
## 📢 News
**2024-12-31:** Training code has been released. ⭐
**2024-09-25:** Another paper of ours, **ECMamba: Consolidating Selective State Space Model with Retinex Guidance for Efficient Multiple Exposure Correction**, has been accepted by NeurIPS 2024. Code and pre-print will be released. :rocket:
**2024-09-21:** Inference code for unpaired images and pre-trained models for LOL-v2-real have been released! :rocket:
**2024-07-21:** Inference code and pre-trained models for LOL have been released! Feel free to use them. ⭐
**2024-07-21:** [License](LICENSE.txt) is updated to Apache License, Version 2.0. 💫
**2024-07-19:** Paper is available on [arXiv](https://arxiv.org/pdf/2407.12431). :tada:
**2024-07-01:** Our paper has been accepted by ECCV 2024. Code and models will be released. :rocket:

## 🛠️ Setup
The inference code was tested on:
- Ubuntu 22.04 LTS, Python 3.8, CUDA 11.3, GeForce RTX 2080Ti or a GPU with more memory.
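Before creating the environment, you can optionally confirm that the GPU and driver are visible (a quick sanity check, not part of the original instructions):

```bash
# Show the GPU, driver version, and the maximum CUDA version the driver supports
nvidia-smi
```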
### 📦 Repository
Clone the repository (requires git):
```bash
git clone https://github.com/LowLevelAI/GLARE.git
cd GLARE
```

### 💻 Dependencies
- **Create a Conda environment** using [Conda](https://conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html):
```bash
conda create -n glare python=3.8
conda activate glare
```
- **Then install dependencies:**
```bash
conda install pytorch=1.11 torchvision cudatoolkit=11.3 -c pytorch
pip install addict future lmdb numpy opencv-python Pillow pyyaml requests scikit-image scipy tqdm yapf einops tb-nightly natsort
pip install pyiqa==0.1.4
pip install pytorch_lightning==1.6.0
pip install --force-reinstall charset-normalizer==3.1.0
```
- **Build CUDA extensions:**
```bash
cd GLARE/defor_cuda_ext
BASICSR_EXT=True python setup.py develop
```
- **Move the compiled CUDA extension** (`/GLARE/defor_cuda_ext/basicsr/ops/dcn/deform_conv_ext.xxxxxx.so`, where `xxxxxx` is a platform-specific suffix) to **/GLARE/code/models/modules/ops/dcn/** (see the example below).
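For example, from the repository root (the exact filename suffix of the compiled extension is platform-dependent, hence the glob; this is a convenience sketch, not an official script):

```bash
# Copy the compiled deformable-convolution extension into the expected location
cp defor_cuda_ext/basicsr/ops/dcn/deform_conv_ext*.so code/models/modules/ops/dcn/

# Optional: confirm that PyTorch was built against CUDA 11.3 and sees the GPU
python -c "import torch; print(torch.__version__, torch.version.cuda, torch.cuda.is_available())"
```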
## 🏃 Testing on benchmark datasets
### 📷 Download following datasets:
- LOL: [Google Drive](https://drive.google.com/file/d/1L-kqSQyrmMueBh_ziWoPFhfsAh50h20H/view?usp=sharing)
- LOL-v2: [Google Drive](https://drive.google.com/file/d/1Ou9EljYZW8o5dbDCf9R34FS8Pd8kEp2U/view?usp=sharing)
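After downloading, extract the archives and make sure the dataset paths in the corresponding .yml configuration files point to them. For example (the archive names and the `datasets/` directory below are illustrative, not prescribed by this repository):

```bash
mkdir -p datasets
# Archive names may differ from what Google Drive serves
unzip LOL.zip -d datasets/
unzip LOL-v2.zip -d datasets/
```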
### ⬇ Download pre-trained models
Download [pre-trained weights for LOL](https://drive.google.com/drive/folders/1DuATvqpNgRGlPq5_LvvzghkFdFL9sYvq) and [pre-trained weights for LOL-v2-real](https://drive.google.com/drive/folders/1Cesn3jJAdxjT7DDZCTMU8Vt2CnauBL7F?usp=drive_link), and place them in the folders `pretrained_weights_lol` and `pretrained_weights_lol-v2-real`, respectively.
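If these folders do not exist yet, create them at the repository root (assumed location) before copying the downloaded weights into them:

```bash
mkdir -p pretrained_weights_lol pretrained_weights_lol-v2-real
```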
### 🚀 Run inference
For the LOL dataset:
```bash
python code/infer_dataset_lol.py
```
For the LOL-v2-real dataset:
```bash
python code/infer_dataset_lolv2-real.py
```
For unpaired testing, make sure the `dataroot_unpaired` entry in the .yml configuration file is correct.
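If you are unsure which configuration file is read, one way to locate the setting (assuming the configs live under `code/`) is:

```bash
# List every config line that defines the unpaired data root
grep -rn "dataroot_unpaired" code/
```

Then run the unpaired inference script: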
```bash
python code/infer_unpaired.py
```
You can find all results in `results/`. **Enjoy!**
## 🏋️ Training
Download the [VQGAN weight for LOL](https://drive.google.com/drive/folders/1DuATvqpNgRGlPq5_LvvzghkFdFL9sYvq) and place it in the folder `pretrained_weights_lol`.
For stage2 training:
```bash
python code/train_stage2.py
```
For stage2 testing:
```bash
python code/test_stage2.py
```
For stage3 training:
```bash
python code/train_stage3.py
```
For stage3 testing:
```bash
python code/test_stage3.py
```

## ✏️ Contributing
Please refer to the [contributing instructions](CONTRIBUTING.md).
## 🎓 Citation
Please cite our paper:
```bibtex
@inproceedings{han2025glare,
  title={Glare: Low light image enhancement via generative latent feature based codebook retrieval},
  author={Zhou, Han and Dong, Wei and Liu, Xiaohong and Liu, Shuaicheng and Min, Xiongkuo and Zhai, Guangtao and Chen, Jun},
  booktitle={European Conference on Computer Vision},
  pages={36--54},
  year={2025},
  organization={Springer}
}

@article{GLARE,
  title={GLARE: Low Light Image Enhancement via Generative Latent Feature based Codebook Retrieval},
  author={Zhou, Han and Dong, Wei and Liu, Xiaohong and Liu, Shuaicheng and Min, Xiongkuo and Zhai, Guangtao and Chen, Jun},
  journal={arXiv preprint arXiv:2407.12431},
  year={2024}
}
```

## 🎫 License
This work is licensed under the Apache License, Version 2.0 (as defined in the [LICENSE](LICENSE.txt)).
By downloading and using the code and model you agree to the terms in the [LICENSE](LICENSE.txt).