[ECCV2022]"Unified Implicit Neural Stylization" which proposes a unified stylization framework for SIREN, SDF and NeRF
- Host: GitHub
- URL: https://github.com/vita-group/ins
- Owner: VITA-Group
- Created: 2022-04-04T05:12:57.000Z (over 3 years ago)
- Default Branch: master
- Last Pushed: 2022-11-03T01:38:35.000Z (almost 3 years ago)
- Last Synced: 2025-03-29T09:51:10.330Z (7 months ago)
- Topics: 3d-reconstruction, nerf, neural-radiance-fields, pytorch-implementation, style-transfer
- Language: Python
- Size: 66.9 MB
- Stars: 109
- Watchers: 12
- Forks: 6
- Open Issues: 1
Metadata Files:
- Readme: README.md
# Unified Implicit Neural Stylization
[MIT License](https://opensource.org/licenses/MIT)

[[Paper]](https://arxiv.org/abs/2204.01943) [[Website]](https://zhiwenfan.github.io/INS/)
## Installation
We recommend using `conda` to set up the environment. The following dependencies are required (a one-shot setup sketch follows the list):
```
pytorch=1.7.0
torchvision=0.8.0
cudatoolkit=11.0
tensorboard=2.7.0
opencv
imageio
imageio-ffmpeg
configargparse
scipy
matplotlib
tqdm
mrc
lpips
```
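As a concrete starting point, here is a sketch of a one-shot setup, not the repo's prescribed commands; the environment name `ins`, the Python version, and the use of `pip` for the non-PyTorch packages are our assumptions:

```
# Sketch of an environment setup; env name "ins" and python=3.8 are assumptions.
conda create -n ins python=3.8 -y
conda activate ins
conda install -y pytorch=1.7.0 torchvision=0.8.0 cudatoolkit=11.0 -c pytorch
# The remaining dependencies can be installed from PyPI.
pip install tensorboard==2.7.0 opencv-python imageio imageio-ffmpeg \
    configargparse scipy matplotlib tqdm mrc lpips
```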
## Data Preparation
To run our code on the NeRF datasets, first download the data from the official [cloud drive](https://drive.google.com/drive/folders/128yBriW1IG_3NJ5Rp7APSTZsJqdJdfc1), then extract the archives into the following directory structure (an extraction sketch follows the tree):
```
├── configs
│   ├── ...
│
├── datasets
│   ├── nerf_llff_data
│   │   ├── room
│   │   ├── horns        # downloaded LLFF dataset
│   │   └── ...
│   └── nerf_synthetic
│       ├── lego
│       ├── chair        # downloaded synthetic dataset
│       └── ...
```
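For example, assuming the downloads are the `nerf_llff_data.zip` and `nerf_synthetic.zip` archives (the actual file names may differ), they can be unpacked as follows:

```
# Archive names are assumptions; adjust them to match your downloads.
mkdir -p datasets
unzip nerf_llff_data.zip -d datasets/
unzip nerf_synthetic.zip -d datasets/
```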
The last step is to generate and process the data via the provided script:
```
python gen_dataset.py --config <config_file>
```
where `<config_file>` is the path to the configuration file of your experiment. Examples of pre-defined configuration files are provided in the `configs` folder.
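For example, a hypothetical invocation for the "room" scene (the config file name below is an assumption; pick an actual file from `configs/`):

```
# "configs/room.txt" is a hypothetical name; use a real file from configs/.
python gen_dataset.py --config configs/room.txt
```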
### Download Prepared Data
We provide a prepared data sample for the scene "room" on [Google Drive](https://drive.google.com/file/d/1W0jtIAu0el04awnSNp712gHZTQJ9nGiU/view?usp=sharing); you can download and use it directly without any modification.

## Testing
After generating the datasets, you can test the conditional style interpolation of INS+NeRF with the following command:
```
bash scripts/linear_eval.sh
```
To run inference on the "horns" scene with the "gris1" style:
```
bash scripts/infer_horns.sh
```

## Training
Training can be launched with:
```
bash scripts/train_room_thescream_28G_mem.sh
```

## Stylizing Textured SDF
We also provide code and scripts to stylize textured signed distance functions based on [Implicit Differentiable Renderer (IDR)](https://arxiv.org/abs/2003.09852).
To prepare the data, run the script `data/download_data.sh`, which downloads the DTU dataset into the `datasets/` directory. Then follow the [instructions in the IDR official repository](https://github.com/lioryariv/idr#installation-requirmenets) to set up the running environment.
Afterwards, train an IDR model on a DTU scan; the available scan IDs are listed in `datasets/DTU`:
```
python run_idr.py --conf ./configs/idr_fixed_cameras.conf --scan_id <scan_id>
```

Finally, one can stylize an IDR model with a style image specified in the configuration file:
```
python run_idr.py --conf <config_file> --scan_id <scan_id> --is_continue
```
where `<config_file>` is one of the two preset configurations, `configs/idr_stylize_face.conf` or `configs/idr_stylize_scream.conf`.
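For example, to stylize a trained IDR model with "The Scream" style (the scan ID `65` is an illustrative placeholder; use an ID listed in `datasets/DTU`):

```
# Scan ID 65 is a placeholder; any ID listed in datasets/DTU works.
python run_idr.py --conf ./configs/idr_stylize_scream.conf --scan_id 65 --is_continue
```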
## Citation
If you find this repo helpful, please cite:
```
@inproceedings{fan2022unified,
title={Unified Implicit Neural Stylization},
author={Fan, Zhiwen and Jiang, Yifan and Wang, Peihao and Gong, Xinyu and Xu, Dejia and Wang, Zhangyang},
booktitle={European Conference on Computer Vision},
year={2022}
}
```