Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/ChikaYan/d2nerf
- Host: GitHub
- URL: https://github.com/ChikaYan/d2nerf
- Owner: ChikaYan
- License: apache-2.0
- Created: 2022-05-20T15:55:56.000Z (over 2 years ago)
- Default Branch: main
- Last Pushed: 2023-08-12T08:45:58.000Z (about 1 year ago)
- Last Synced: 2024-03-06T12:55:32.820Z (8 months ago)
- Language: Jupyter Notebook
- Size: 1.79 MB
- Stars: 178
- Watchers: 7
- Forks: 14
- Open Issues: 1
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
Awesome Lists containing this project
- awesome-NeRF - [D2NeRF: Self-Supervised Decoupling of Dynamic and Static Objects from a Monocular Video](https://arxiv.org/pdf/2205.15838.pdf) | [Project Page](https://d2nerf.github.io/) (Papers / NeRF)
README
# D2NeRF: Self-Supervised Decoupling of Dynamic and Static Objects from a Monocular Video
This is the code for "D2NeRF: Self-Supervised Decoupling of Dynamic and Static Objects from a Monocular Video".
![image](imgs/title_card.png)
* Project Page: https://d2nerf.github.io/
This codebase implements D2NeRF on top of [HyperNeRF](https://github.com/google/hypernerf).
## Setup
The code can be run in any environment with Python 3.8 or above. First, set up an environment via Miniconda or Anaconda:

    conda create --name d2nerf python=3.8
Next, install the required packages:

    pip install -r requirements.txt
Install the appropriate JAX distribution for your environment by [following the instructions here](https://github.com/google/jax#installation). For example:

    pip install --upgrade "jax[cuda]" -f https://storage.googleapis.com/jax-releases/jax_releases.html
## Training
Please download our dataset [here](https://drive.google.com/drive/folders/1qm-8P6UqrhimZXp4USzFPumyfu8l1vto?usp=sharing). After unzipping the data, you can train with the following command:

    export DATASET_PATH=/path/to/dataset
    export EXPERIMENT_PATH=/path/to/save/experiment/to
    export CONFIG_PATH=configs/rl/001.gin
    python train.py \
      --base_folder $EXPERIMENT_PATH \
      --gin_bindings="data_dir='$DATASET_PATH'" \
      --gin_configs $CONFIG_PATH

To plot telemetry to TensorBoard and render checkpoints on the fly, also
launch an evaluation job by running:

    python eval.py \
      --base_folder $EXPERIMENT_PATH \
      --gin_bindings="data_dir='$DATASET_PATH'" \
      --gin_configs $CONFIG_PATH

We also provide an example script at `train_eval_balloon.sh`.
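The `--gin_configs` and `--gin_bindings` flags above both feed into Gin. As a rough illustration of the kind of file `--gin_configs` points at, a minimal fragment might look like the following (the included path and parameter names are placeholders for illustration, not taken from this repository's configs):

```gin
# Illustrative Gin fragment; names are hypothetical.
include 'configs/decompose/decompose.gin'  # pull in a shared template config

data_dir = '/path/to/dataset'   # usually overridden via --gin_bindings
TrainConfig.batch_size = 1024   # Gin binds values to parameters as Class.parameter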
## Configuration
* Similar to HyperNeRF, we use [Gin](https://github.com/google/gin-config) for configuration.
* We provide a couple of preset configurations:
  - `configs/decompose/`: template configurations defining shared settings for NeRF and HyperNeRF.
- `configs/rl/`: configurations for experiments on real-life scenes.
- `configs/synthetic/`: configurations for experiments on synthetic scenes.
* Please refer to the appendix of the paper on arXiv for the configurations applied to each scene.
* Please refer to `config.py` for documentation on what each configuration does.
## Dataset
The dataset uses the [same format as Nerfies](https://github.com/google/nerfies#datasets). For synthetic scenes generated using [Kubric](https://github.com/google-research/kubric), we also provide the worker script, named `script.py`, under each folder.
## Running on your own dataset
Because our code is fully compatible with the HyperNeRF dataset format, you can simply use their [colab notebook](https://colab.research.google.com/github/google/nerfies/blob/main/notebooks/Nerfies_Capture_Processing.ipynb) to process your video and prepare a dataset for training.
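Before launching training, it can help to sanity-check the processed capture. The sketch below assumes the Nerfies-style layout, where a `dataset.json` at the dataset root lists frame IDs under `train_ids` and `val_ids`; the helper name and field names follow the Nerfies format and are not verified against this repository's loaders.

```python
# Minimal sanity check for a Nerfies-format dataset directory.
# summarize_dataset is a hypothetical helper, not part of this repo.
import json
from pathlib import Path

def summarize_dataset(root):
    """Return the train/val split sizes recorded in dataset.json."""
    dataset = json.loads((Path(root) / "dataset.json").read_text())
    return {
        "num_train": len(dataset.get("train_ids", [])),
        "num_val": len(dataset.get("val_ids", [])),
    }
```

If the counts come back as zero, the capture-processing notebook most likely did not finish writing the split files.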