# LiDAR4D: Dynamic Neural Fields for Novel Space-time View LiDAR Synthesis

[Zehan Zheng](https://dyfcalid.github.io/), [Fan Lu](https://fanlu97.github.io/), Weiyi Xue, [Guang Chen](https://ispc-group.github.io/)†, Changjun Jiang († Corresponding author)
**CVPR 2024**

**[Paper (arXiv)](https://arxiv.org/abs/2404.02742) | [Paper (CVPR)](https://openaccess.thecvf.com/content/CVPR2024/html/Zheng_LiDAR4D_Dynamic_Neural_Fields_for_Novel_Space-time_View_LiDAR_Synthesis_CVPR_2024_paper.html) | [Project Page](https://dyfcalid.github.io/LiDAR4D) | [Video](https://www.youtube.com/watch?v=E6XyG3A3EZ8) | [Poster](https://drive.google.com/file/d/13cf0rSjCjGRyBsYOcQSa6Qf1Oe1a5QCy/view?usp=sharing) | [Slides](https://drive.google.com/file/d/1Q6yTVGoBf_nfWR4rW9RcSGlxRMufmSXc/view?usp=sharing)**

This repository is the official PyTorch implementation for LiDAR4D.


## Table of Contents

1. Changelog
2. Demo
3. Introduction
4. Getting started
5. Results
6. Simulation
7. Citation

## Changelog
- 2024-6-1: 🕹️ We release the simulator for easier rendering and manipulation. *Happy Children's Day and have fun!*
- 2024-5-4: 📈 We update the flow fields and improve temporal interpolation.
- 2024-4-13: 📈 We update the U-Net of LiDAR4D for better ray-drop refinement.
- 2024-4-5: 🚀 Code of LiDAR4D is released.
- 2024-4-4: 🔥 The preprint paper is available on arXiv, along with the project page.
- 2024-2-27: 🎉 Our paper is accepted by CVPR 2024.

## Demo

## Introduction

LiDAR4D is a differentiable LiDAR-only framework for novel space-time LiDAR view synthesis, which reconstructs dynamic driving scenarios and generates realistic LiDAR point clouds end-to-end. It adopts 4D hybrid neural representations and motion priors derived from point clouds for geometry-aware and time-consistent large-scale scene reconstruction.

## Getting started

### 🛠️ Installation

```bash
git clone https://github.com/ispc-lab/LiDAR4D.git
cd LiDAR4D

conda create -n lidar4d python=3.9
conda activate lidar4d

# PyTorch
# CUDA 12.1
pip install torch==2.1.0 torchvision==0.16.0 torchaudio==2.1.0 --index-url https://download.pytorch.org/whl/cu121
# CUDA 11.8
# pip install torch==2.1.0 torchvision==0.16.0 torchaudio==2.1.0 --index-url https://download.pytorch.org/whl/cu118
# CUDA <= 11.7
# pip install torch==2.0.0 torchvision torchaudio

# Dependencies
pip install -r requirements.txt

# Local compile for tiny-cuda-nn
git clone --recursive https://github.com/nvlabs/tiny-cuda-nn
cd tiny-cuda-nn/bindings/torch
python setup.py install

# Compile the chamfer3D package in utils
cd ../../..  # back to the LiDAR4D repository root
cd utils/chamfer3D
python setup.py install
```
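As an optional sanity check (not part of the official instructions), you can verify that the CUDA build of PyTorch sees the GPU and that the tiny-cuda-nn bindings import; `tinycudann` is the Python module installed by the bindings above:

```bash
# Optional: confirm the CUDA build of PyTorch and the tiny-cuda-nn bindings are usable.
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
python -c "import tinycudann; print('tiny-cuda-nn OK')"
```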

### 📁 Dataset

#### KITTI-360 dataset ([Download](https://www.cvlibs.net/datasets/kitti-360/download.php))
We use sequence00 (`2013_05_28_drive_0000_sync`) for experiments in our paper.

Download the KITTI-360 dataset (2D images are not needed) and put it into `data/kitti360`
(or use a symlink: `ln -s DATA_ROOT/KITTI-360 ./data/kitti360/`).
The folder tree should look as follows:

```bash
data
└── kitti360
    └── KITTI-360
        ├── calibration
        ├── data_3d_raw
        └── data_poses
```

Next, run the KITTI-360 dataset preprocessing (set `DATASET` and `SEQ_ID` in the script first):

```bash
bash preprocess_data.sh
```
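For reference, `DATASET` and `SEQ_ID` are set inside `preprocess_data.sh`; the values below are only an illustration (their exact names and formats are defined by the script, so check it before editing):

```bash
# Illustrative values only; see preprocess_data.sh for the actual variable definitions.
DATASET="kitti360"   # dataset name/root used by the preprocessing script (assumed meaning)
SEQ_ID="4950"        # KITTI-360 sequence ID, e.g. one of the sequences used in the paper
```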

After preprocessing, your folder structure should look like this:

```bash
configs
├── kitti360_{sequence_id}.txt
data
└── kitti360
    ├── KITTI-360
    │   ├── calibration
    │   ├── data_3d_raw
    │   └── data_poses
    ├── train
    ├── transforms_{sequence_id}test.json
    ├── transforms_{sequence_id}train.json
    └── transforms_{sequence_id}val.json
```
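A quick, optional check that preprocessing produced the expected files (using `4950` as an example sequence ID):

```bash
# Optional: confirm the generated config and transforms files exist for one sequence (example ID: 4950).
ls configs/kitti360_4950.txt
ls data/kitti360/transforms_4950*.json
```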

### 🚀 Run LiDAR4D

Set the corresponding sequence config path in `--config`, and change the logging path via `--workspace` if needed. Remember to set an available GPU ID in `CUDA_VISIBLE_DEVICES`.
Run the following command:
```bash
# KITTI-360
bash run_kitti_lidar4d.sh
```
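For illustration, the script essentially wraps a single command using the options mentioned above; a hedged sketch (the entry-point name `main_lidar4d.py` is an assumption, check `run_kitti_lidar4d.sh` for the actual invocation):

```bash
# Hedged sketch of what run_kitti_lidar4d.sh runs; the script name main_lidar4d.py is an assumption.
CUDA_VISIBLE_DEVICES=0 python main_lidar4d.py \
    --config configs/kitti360_4950.txt \
    --workspace log/kitti360_lidar4d_4950
```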

## 📊 Results

**KITTI-360 *Dynamic* Dataset** (Sequences: `2350` `4950` `8120` `10200` `10750` `11400`)


| Method | Point Cloud CD↓ | Point Cloud F-Score↑ | Depth RMSE↓ | Depth MedAE↓ | Depth LPIPS↓ | Depth SSIM↑ | Depth PSNR↑ | Intensity RMSE↓ | Intensity MedAE↓ | Intensity LPIPS↓ | Intensity SSIM↑ | Intensity PSNR↑ |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| LiDAR-NeRF | 0.1438 | 0.9091 | 4.1753 | 0.0566 | 0.2797 | 0.6568 | 25.9878 | 0.1404 | 0.0443 | 0.3135 | 0.3831 | 17.1549 |
| LiDAR4D (Ours) † | 0.1002 | 0.9320 | 3.0589 | 0.0280 | 0.0689 | 0.8770 | 28.7477 | 0.0995 | 0.0262 | 0.1498 | 0.6561 | 20.0884 |


**KITTI-360 *Static* Dataset** (Sequences: `1538` `1728` `1908` `3353`)


| Method | Point Cloud CD↓ | Point Cloud F-Score↑ | Depth RMSE↓ | Depth MedAE↓ | Depth LPIPS↓ | Depth SSIM↑ | Depth PSNR↑ | Intensity RMSE↓ | Intensity MedAE↓ | Intensity LPIPS↓ | Intensity SSIM↑ | Intensity PSNR↑ |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| LiDAR-NeRF | 0.0923 | 0.9226 | 3.6801 | 0.0667 | 0.3523 | 0.6043 | 26.7663 | 0.1557 | 0.0549 | 0.4212 | 0.2768 | 16.1683 |
| LiDAR4D (Ours) † | 0.0834 | 0.9312 | 2.7413 | 0.0367 | 0.0995 | 0.8484 | 29.3359 | 0.1116 | 0.0335 | 0.1799 | 0.6120 | 19.0619 |

†: Latest results, which improve on those reported in the paper.
*Experiments were conducted on an NVIDIA RTX 4090 GPU. Results may vary slightly due to randomness.*

## 🕹️ Simulation

After reconstruction, you can use the simulator to render and manipulate LiDAR point clouds across the whole scene. It supports dynamic scene re-play, novel LiDAR configurations (`--fov_lidar`, `--H_lidar`, `--W_lidar`), and novel trajectories (`--shift_x`, `--shift_y`, `--shift_z`).
We also provide a simple demo setting that transforms the LiDAR configuration from KITTI-360 to NuScenes, using `--kitti2nus` in the bash script.
Check the sequence config and the corresponding workspace and model path (`--ckpt`).
Run the following command:
```bash
bash run_kitti_lidar4d_sim.sh
```
The results will be saved in the workspace folder.
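As an illustration of how the flags above fit together, a hedged sketch (the entry-point name, argument value formats, and paths are assumptions; `run_kitti_lidar4d_sim.sh` contains the actual command and defaults):

```bash
# Hedged sketch only; entry-point name, argument value formats, and paths are assumptions.
# --H_lidar/--W_lidar set a novel LiDAR resolution; --shift_z offsets the rendering trajectory vertically.
CUDA_VISIBLE_DEVICES=0 python main_lidar4d_sim.py \
    --config configs/kitti360_4950.txt \
    --workspace log/kitti360_lidar4d_4950 \
    --ckpt PATH_TO_TRAINED_CHECKPOINT \
    --H_lidar 32 --W_lidar 1080 \
    --shift_z 0.5
```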

## Acknowledgement
We sincerely appreciate the great contributions of the following works:
- [tiny-cuda-nn](https://github.com/NVlabs/tiny-cuda-nn/tree/master)
- [LiDAR-NeRF](https://github.com/tangtaogo/lidar-nerf)
- [NFL](https://research.nvidia.com/labs/toronto-ai/nfl/)
- [K-Planes](https://github.com/sarafridov/K-Planes)

## Citation
If you find our repo or paper helpful, feel free to support us with a star 🌟 or use the following citation:
```bibtex
@inproceedings{zheng2024lidar4d,
  title     = {LiDAR4D: Dynamic Neural Fields for Novel Space-time View LiDAR Synthesis},
  author    = {Zheng, Zehan and Lu, Fan and Xue, Weiyi and Chen, Guang and Jiang, Changjun},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year      = {2024}
}
```

## License
All code within this repository is under [Apache License 2.0](https://www.apache.org/licenses/LICENSE-2.0).