
# D2NeRF: Self-Supervised Decoupling of Dynamic and Static Objects from a Monocular Video

This is the code for "D2NeRF: Self-Supervised Decoupling of Dynamic and Static Objects from a Monocular Video".

![image](imgs/title_card.png)

* [Project Page](https://d2nerf.github.io/)

This codebase implements D2NeRF based on [HyperNeRF](https://github.com/google/hypernerf).

## Setup
The code can be run under any environment with Python 3.8 and above.

Firstly, set up an environment via Miniconda or Anaconda:

```
conda create --name d2nerf python=3.8
```

Next, install the required packages:

```
pip install -r requirements.txt
```

Install the appropriate JAX distribution for your environment by [following the instructions here](https://github.com/google/jax#installation). For example:

```
pip install --upgrade "jax[cuda]" -f https://storage.googleapis.com/jax-releases/jax_releases.html
```
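To confirm the install succeeded, a quick sanity check (a minimal sketch; `jax_backend` is a hypothetical helper, not part of this repository) that JAX imports and reports its active backend:

```python
def jax_backend():
    """Return the active JAX backend name ('cpu', 'gpu', ...), or None if JAX is missing."""
    try:
        import jax
    except ImportError:
        return None
    return jax.default_backend()

print(jax_backend())  # a working CUDA build reports 'gpu'; a CPU-only install reports 'cpu'
```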

## Training
Please download our dataset [here](https://drive.google.com/drive/folders/1qm-8P6UqrhimZXp4USzFPumyfu8l1vto?usp=sharing).

After unzipping the data, you can train with the following command:

```
export DATASET_PATH=/path/to/dataset
export EXPERIMENT_PATH=/path/to/save/experiment/to
export CONFIG_PATH=configs/rl/001.gin
python train.py \
--base_folder $EXPERIMENT_PATH \
--gin_bindings="data_dir='$DATASET_PATH'" \
--gin_configs $CONFIG_PATH
```

To plot telemetry to TensorBoard and render checkpoints on the fly, also
launch an evaluation job by running:

```
python eval.py \
--base_folder $EXPERIMENT_PATH \
--gin_bindings="data_dir='$DATASET_PATH'" \
--gin_configs $CONFIG_PATH
```

We also provide an example script at `train_eval_balloon.sh`.

## Configuration
* Similar to HyperNeRF, we use [Gin](https://github.com/google/gin-config) for configuration.
* We provide several preset configurations:
  - `configs/decompose/`: template configurations defining shared settings for NeRF and HyperNeRF.
  - `configs/rl/`: configurations for experiments on real-life scenes.
  - `configs/synthetic/`: configurations for experiments on synthetic scenes.
* Please refer to the paper appendix on arXiv for the configurations applied to each scene.
* Please refer to `config.py` for documentation on what each configuration does.
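As an illustration of the Gin format (the include path and parameter names below are hypothetical, not taken from this repository's `config.py`), a scene configuration typically includes a shared template and then overrides individual bindings:

```
include 'configs/decompose/base.gin'  # hypothetical template path

# Scene-specific overrides (names illustrative):
TrainConfig.batch_size = 1024
TrainConfig.max_steps = 250000
```

The same `Class.parameter = value` bindings can also be passed on the command line via `--gin_bindings`, as in the training command above.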

## Dataset
The dataset uses the [same format as Nerfies](https://github.com/google/nerfies#datasets).

For synthetic scenes generated using [Kubric](https://github.com/google-research/kubric), we also provide the worker script, named `script.py`, under each folder.
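Since the dataset follows the Nerfies layout, a small stdlib-only sanity check can catch unzipping mistakes before a long training run. This is a hedged sketch: `check_nerfies_dataset` is a hypothetical helper, and it only verifies the top-level pieces documented in the Nerfies repository (`dataset.json`, `scene.json`, `camera/`, `rgb/`):

```python
import json
from pathlib import Path

def check_nerfies_dataset(root):
    """Report missing top-level pieces of a Nerfies-format dataset directory.

    Hypothetical helper: checks only the coarse layout, not the contents
    of the per-frame camera files or image folders.
    """
    root = Path(root)
    problems = []
    for name in ("dataset.json", "scene.json"):
        if not (root / name).is_file():
            problems.append(f"missing {name}")
    for sub in ("camera", "rgb"):
        if not (root / sub).is_dir():
            problems.append(f"missing {sub}/ directory")
    if "missing dataset.json" not in problems:
        meta = json.loads((root / "dataset.json").read_text())
        for key in ("ids", "train_ids", "val_ids"):
            if key not in meta:
                problems.append(f"dataset.json lacks '{key}'")
    return problems
```

Running it on an unzipped scene folder returns an empty list when the expected layout is present, and a list of human-readable complaints otherwise.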

## Running on your own dataset

Because our code is fully compatible with the HyperNeRF dataset format, you can simply use their [colab notebook](https://colab.research.google.com/github/google/nerfies/blob/main/notebooks/Nerfies_Capture_Processing.ipynb) to process your video and prepare a dataset for training.