Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
[ECCV 2022] Relighting4D: Neural Relightable Human from Videos
- Host: GitHub
- URL: https://github.com/FrozenBurning/Relighting4D
- Owner: FrozenBurning
- License: other
- Created: 2022-07-10T12:34:58.000Z (over 2 years ago)
- Default Branch: master
- Last Pushed: 2023-02-16T02:23:32.000Z (over 1 year ago)
- Last Synced: 2024-04-05T09:35:02.048Z (7 months ago)
- Topics: eccv2022, illumination, nerf, neural-rendering, reflectance, relighting, view-synthesis
- Language: Python
- Homepage: https://frozenburning.github.io/projects/relighting4d/
- Size: 218 KB
- Stars: 251
- Watchers: 6
- Forks: 19
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Relighting4D: Neural Relightable Human from Videos
S-Lab, Nanyang Technological University

### [Project Page](https://frozenburning.github.io/projects/relighting4d) | [Video](https://youtu.be/NayAw89qtsY) | [Paper](https://arxiv.org/abs/2207.07104)
## Updates
- [08/2022] Model weights released. [![Google Drive](https://img.shields.io/badge/Google%20Drive-4285F4?style=for-the-badge&logo=googledrive&logoColor=yellow)](https://drive.google.com/drive/folders/14pvUxVNCrKEFYjy3h2nc_i1Y7rrcKq2q?usp=sharing)
- [07/2022] Paper uploaded to arXiv. [![arXiv](https://img.shields.io/badge/arXiv-2207.07104-b31b1b.svg)](https://arxiv.org/abs/2207.07104)
- [07/2022] Code released.
## Citation
If you find our work useful for your research, please consider citing this paper:
```
@inproceedings{chen2022relighting,
title={Relighting4D: Neural Relightable Human from Videos},
author={Zhaoxi Chen and Ziwei Liu},
booktitle={ECCV},
year={2022}
}
```

## Installation
We recommend using [Anaconda](https://www.anaconda.com/) to manage your Python environment. You can set up the required environment with the following commands:
```bash
conda env create -f environment.yml
conda activate relighting4d
```
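After activating the environment, you may want to confirm that PyTorch can see your GPU before moving on. The snippet below is only a hypothetical sanity check (not part of the repository) and assumes PyTorch is installed by `environment.yml`:

```python
# check_env.py -- hypothetical sanity check, not part of the repository.
# Confirms that PyTorch is importable and that a CUDA device is visible.
import torch

print("torch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("device:", torch.cuda.get_device_name(0))
```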
## Datasets

### People-Snapshot
We follow [NeuralBody](https://github.com/zju3dv/neuralbody) for data preparation.
1. Download the People-Snapshot dataset [here](https://graphics.tu-bs.de/people-snapshot).
2. Process the People-Snapshot dataset using the [script](tools/process_snapshot.py).
3. Create a soft link:

```bash
cd /path/to/Relighting4D
mkdir -p data
cd data
ln -s /path/to/people_snapshot people_snapshot
```
### ZJU-MoCap

Please refer to [this page](https://github.com/zju3dv/neuralbody/blob/master/INSTALL.md) to request the download link. Once downloaded, don't forget to add a soft link:
```bash
cd /path/to/Relighting4D
mkdir -p data
cd data
ln -s /path/to/zju_mocap zju_mocap
```
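Before training, it can help to confirm that the soft links under `data/` exist and resolve. The helper below is a hypothetical convenience script (not part of the repository); adjust `EXPECTED_LINKS` if you only prepared one of the two datasets:

```python
# check_data_links.py -- hypothetical helper, not part of the repository.
# Verifies that the dataset soft links under data/ exist and resolve.
import os
from pathlib import Path

EXPECTED_LINKS = ["people_snapshot", "zju_mocap"]  # keep only the datasets you prepared

def check_links(data_root: str = "data") -> None:
    root = Path(data_root)
    for name in EXPECTED_LINKS:
        link = root / name
        if link.is_symlink() and not link.exists():
            print(f"[broken]  {link} -> {os.readlink(link)}")
        elif link.exists():
            print(f"[ok]      {link} -> {link.resolve()}")
        else:
            print(f"[missing] {link} (create it with `ln -s /path/to/{name} {link}`)")

if __name__ == "__main__":
    check_links()
```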
## Training

We first reconstruct an auxiliary density field in Stage I and then train the whole pipeline in Stage II. All training runs are done on a Tesla V100 GPU with 16 GB of memory.

Take the training on `female-3-casual` as an example.
* Stage I:
```bash
python train_net.py --cfg_file configs/snapshot_exp/snapshot_f3c.yaml exp_name female3c resume False gpus "0,"
```
The model weights will be saved to `data/trained_model/if_nerf/female3c/latest.pth`.

* Stage II:
```bash
python train_net.py --cfg_file configs/snapshot_exp/snapshot_f3c.yaml task relighte2e exp_name female3c_relight train_relight True resume False train_relight_cfg.smpl_model_ckpt ./data/trained_model/if_nerf/female3c/latest.pth gpus "0,"
```
The final model will be saved to `data/trained_model/relighte2e/female3c_relight/latest.pth`. A sketch that chains both stages is given after this list.

* Tensorboard:
```
tensorboard --logdir data/record/if_nerf
tensorboard --logdir data/record/relighte2e
```
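If you prefer to run both stages back to back, the sketch below is a hypothetical driver script (not part of the repository) that simply replays the two commands above via `subprocess`; adjust the config file, experiment names, and GPU string to your setup.

```python
# run_training.py -- hypothetical two-stage driver, not part of the repository.
# Replays the two documented train_net.py invocations in sequence.
import subprocess

CFG = "configs/snapshot_exp/snapshot_f3c.yaml"  # config for female-3-casual
EXP = "female3c"
GPUS = "0,"

# Stage I: reconstruct the auxiliary density field.
stage1 = [
    "python", "train_net.py", "--cfg_file", CFG,
    "exp_name", EXP, "resume", "False", "gpus", GPUS,
]

# Stage II: train the full relighting pipeline on top of the Stage I weights.
stage2 = [
    "python", "train_net.py", "--cfg_file", CFG,
    "task", "relighte2e", "exp_name", f"{EXP}_relight",
    "train_relight", "True", "resume", "False",
    "train_relight_cfg.smpl_model_ckpt", f"./data/trained_model/if_nerf/{EXP}/latest.pth",
    "gpus", GPUS,
]

for cmd in (stage1, stage2):
    print("Running:", " ".join(cmd))
    subprocess.run(cmd, check=True)  # stop if a stage fails
```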
## Rendering

To relight a human performer from the trained video, our model requires an HDR environment map as input. We provide 8 HDR maps at [light-probes](light-probes/). You can also use your own HDRIs or download samples from [Poly Haven](https://polyhaven.com/hdris).

You are also welcome to download our checkpoints from [Google Drive](https://drive.google.com/drive/folders/14pvUxVNCrKEFYjy3h2nc_i1Y7rrcKq2q?usp=sharing).
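If you would rather fetch the released checkpoints from the command line, the sketch below uses the third-party `gdown` package (an assumption, not a project dependency) to download the Google Drive folder:

```python
# fetch_checkpoints.py -- hypothetical helper, not part of the repository.
# Downloads the released checkpoint folder from Google Drive using gdown
# (install it first with `pip install gdown`).
import gdown

CKPT_FOLDER_URL = (
    "https://drive.google.com/drive/folders/"
    "14pvUxVNCrKEFYjy3h2nc_i1Y7rrcKq2q?usp=sharing"
)

# Saves the folder contents under ./checkpoints/.
gdown.download_folder(url=CKPT_FOLDER_URL, output="checkpoints", quiet=False)
```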
Here, we take the rendering on `female-3-casual` as an example.
* Relight with novel views of a single frame
```bash
python run.py --type relight --cfg_file configs/snapshot_exp/snapshot_f3c.yaml exp_name female3c_relight task relighte2e vis_relight True ratio 0.5 gpus "0,"
```

* Relight the dynamic humans in video frames
```bash
python run.py --type relight_npose --cfg_file configs/snapshot_exp/snapshot_f3c.yaml exp_name female3c_relight task relighte2e vis_relight True vis_relight_npose True ratio 0.5 pyramid False gpus "0,"
```
The rendering results are saved to `data/render/`. Example renderings with the [courtyard HDR environment](light-probes/courtyard.hdr) are shown in the original README.
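To preview a light probe before rendering (for example, to check its resolution and dynamic range), you can load it with OpenCV, which reads Radiance `.hdr` files. The sketch below is a hypothetical helper, not part of the project pipeline, and assumes `opencv-python` and `numpy` are installed:

```python
# preview_hdr.py -- hypothetical helper, not part of the repository.
# Loads an HDR environment map, reports basic stats, and writes a tonemapped PNG preview.
import cv2
import numpy as np

hdr = cv2.imread("light-probes/courtyard.hdr", cv2.IMREAD_ANYDEPTH | cv2.IMREAD_ANYCOLOR)
print("shape:", hdr.shape, "dtype:", hdr.dtype)
print("radiance range:", float(hdr.min()), "to", float(hdr.max()))

# Simple gamma tonemapping for an 8-bit preview image.
tonemap = cv2.createTonemap(gamma=2.2)
ldr = tonemap.process(hdr.astype(np.float32))
cv2.imwrite("courtyard_preview.png", np.clip(ldr * 255, 0, 255).astype(np.uint8))
```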
## Acknowledgements
This work is supported by the National Research Foundation, Singapore under its AI Singapore Programme, NTU NAP, MOE AcRF Tier 2 (T2EP20221-0033), and under the RIE2020 Industry Alignment Fund - Industry Collaboration Projects (IAF-ICP) Funding Initiative, as well as cash and in-kind contribution from the industry partner(s).

Relighting4D is implemented on top of the [NeuralBody](https://github.com/zju3dv/neuralbody) codebase.