Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://ku-cvlab.github.io/DaRF
Official code implementation of "DäRF: Boosting Radiance Fields from Sparse Inputs with Monocular Depth Adaptation" (NeurIPS 2023)
- Host: GitHub
- URL: https://ku-cvlab.github.io/DaRF
- Owner: KU-CVLAB
- License: MIT
- Created: 2023-05-26T09:43:20.000Z (over 1 year ago)
- Default Branch: master
- Last Pushed: 2024-02-19T05:43:22.000Z (9 months ago)
- Last Synced: 2024-06-01T23:38:21.353Z (5 months ago)
- Language: Python
- Homepage: https://ku-cvlab.github.io/DaRF/
- Size: 189 MB
- Stars: 63
- Watchers: 3
- Forks: 0
- Open Issues: 1
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- awesome-scene-representation - [DäRF: Boosting Radiance Fields from Sparse Inputs with Monocular Depth Adaptation](https://github.com/KU-CVLAB/DaRF) | [bibtex](./citations/song2023darf.txt) (Uncategorized / Uncategorized)
README
# DäRF: Boosting Radiance Fields from Sparse Inputs with Monocular Depth Adaptation (NeurIPS 2023)
This is the official implementation of the paper "DäRF: Boosting Radiance Fields from Sparse Inputs with Monocular Depth Adaptation".
## Introduction
Unlike existing work (e.g. SCADE [CVPR'23]) that distills depth from a pretrained monocular depth estimation (MDE) network into NeRF at seen views only, our DäRF fully exploits the MDE by jointly optimizing NeRF and the MDE on a specific scene, distilling the monocular depth prior into NeRF at both seen and unseen views. For more details, please visit our [project page](https://ku-cvlab.github.io/DaRF/)!
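Since MiDaS-style monocular depth is only defined up to an unknown scale and shift, distilling it into NeRF's metric depth requires aligning the two first. Below is a minimal, self-contained sketch of that standard alignment step, a least-squares scale/shift fit followed by an L1 penalty. This illustrates the general technique only; it is not the repository's actual loss code, and all names are hypothetical.
```
# Sketch: align relative monocular depth to NeRF depth before distillation.
import torch

def fit_scale_shift(d_mono: torch.Tensor, d_nerf: torch.Tensor):
    """Closed-form least squares for s, t in min ||s * d_mono + t - d_nerf||^2."""
    a = torch.stack([d_mono.flatten(), torch.ones_like(d_mono).flatten()], dim=1)
    sol = torch.linalg.lstsq(a, d_nerf.flatten().unsqueeze(1)).solution  # (2, 1)
    return sol[0, 0], sol[1, 0]

# Toy check: a depth map that is an exact scaled/shifted copy is recovered.
d_nerf = torch.rand(64, 64) * 5.0
d_mono = (d_nerf - 0.3) / 2.0          # so d_nerf = 2.0 * d_mono + 0.3
s, t = fit_scale_shift(d_mono, d_nerf)
print(f"scale ~ {s.item():.3f}, shift ~ {t.item():.3f}")  # ~2.000, ~0.300
depth_loss = (s * d_mono + t - d_nerf).abs().mean()       # distillation penalty
```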
## TODO
- [ ] Release the pretrained weights on ScanNet
- [ ] TNT / in-the-wild datasets and dataloaders

## Installation
An example of installation is shown below:
```
git clone https://github.com/KU-CVLAB/DaRF.git
cd DaRF
conda create -n DaRF python=3.8
conda activate DaRF
pip install -r requirements.txt
pip install git+https://github.com/NVlabs/tiny-cuda-nn/#subdirectory=bindings/torch
```
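As an optional sanity check (not part of the official instructions), you can verify that PyTorch and the tiny-cuda-nn bindings installed above import correctly. This assumes a CUDA-capable GPU is present.
```
# Optional sanity check for the environment built above.
import torch
import tinycudann as tcnn  # bindings installed from the NVlabs repo above

print("torch:", torch.__version__, "| CUDA available:", torch.cuda.is_available())
```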
You also need to download the pretrained MiDaS 3.0 weights (dpt_hybrid_384) from [here](https://github.com/isl-org/MiDaS), and replace the `dpt_pretrained_weight` entry of the config file with the path to the downloaded weights (see the sketch at the end of the next section).
## Dataset Download
You can download the ScanNet dataset [here](https://koreaoffice-my.sharepoint.com/:u:/g/personal/seong0905_korea_ac_kr/EQ4QA4rpC5hMib8XVs30pJMBTT7oQXBFkY5lrXjpH3YH3g?e=g49eBd). If you download the data to a different path, replace the `data_dirs` entry of the config file with the downloaded dataset path, as in the sketch below.
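The two config entries mentioned above might look like the following. This is a hypothetical excerpt, assuming the K-Planes-style Python config layout this codebase builds on; the key names come from the instructions above and the paths are placeholders.
```
# Hypothetical excerpt of a plenoxels/configs/07XX.py-style config.
config = {
    # path to the downloaded MiDaS 3.0 checkpoint (dpt_hybrid_384)
    "dpt_pretrained_weight": "/path/to/weights/dpt_hybrid_384.pt",
    # path(s) to the downloaded ScanNet scenes
    "data_dirs": ["/path/to/scannet"],
    # ... remaining training hyperparameters ...
}
```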
## Training
* 18-20 view Training
```
PYTHONPATH='.' python plenoxels/main.py --config plenoxels/configs/07XX.py
```
* 9-10 view Training
```
PYTHONPATH='.' python plenoxels/main.py --config plenoxels/configs/07XX_few.py
```

## Evaluation / Rendering
For evaluation or rendering, replace the `checkpoint` entry of the config file with the path to the trained weights, as sketched below.
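Again as a hypothetical excerpt (the key name comes from the instruction above; the path is a placeholder):
```
# Point the config at the trained weights before running
# --validate-only or --render-only.
config = {
    "checkpoint": "/path/to/trained/model.pth",
    # ... rest of the config unchanged ...
}
```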
* 18-20 view Evaluation
```
PYTHONPATH='.' python plenoxels/main.py --config plenoxels/configs/07XX.py --validate-only --load_model
```
* 18-20 view Rendering
```
PYTHONPATH='.' python plenoxels/main.py --config plenoxels/configs/07XX.py --render-only --load_model
```
* 9-10 view Evaluation
```
PYTHONPATH='.' python plenoxels/main.py --config plenoxels/configs/07XX_few.py --validate-only --load_model
```
* 9-10 view Rendering
```
PYTHONPATH='.' python plenoxels/main.py --config plenoxels/configs/07XX_few.py --render-only --load_model
```

## Acknowledgements
This code heavily borrows from [K-Planes](https://github.com/sarafridov/K-Planes).

## Citation
If you use this software package, please cite our paper:
```
@article{song2023d,
title={D{\"a}RF: Boosting Radiance Fields from Sparse Inputs with Monocular Depth Adaptation},
author={Song, Jiuhn and Park, Seonghoon and An, Honggyu and Cho, Seokju and Kwak, Min-Seop and Cho, Sungjin and Kim, Seungryong},
journal={arXiv preprint arXiv:2305.19201},
year={2023}
}
```