[CVPR 2024] LiDAR4D: Dynamic Neural Fields for Novel Space-time View LiDAR Synthesis
- Host: GitHub
- URL: https://github.com/ispc-lab/LiDAR4D
- Owner: ispc-lab
- License: apache-2.0
- Created: 2024-03-06T07:39:59.000Z (10 months ago)
- Default Branch: main
- Last Pushed: 2024-06-18T14:59:10.000Z (6 months ago)
- Last Synced: 2024-08-01T03:33:15.826Z (5 months ago)
- Topics: autonomous-driving, computer-vision, cvpr2024, dynamic-scene, lidar, neural-rendering, novel-view-synthesis, point-cloud, reconstruction
- Language: Python
- Homepage: https://dyfcalid.github.io/LiDAR4D
- Size: 86.9 KB
- Stars: 125
- Watchers: 4
- Forks: 10
- Open Issues: 1
- Metadata Files:
  - Readme: README.md
  - License: LICENSE
# LiDAR4D: Dynamic Neural Fields for Novel Space-time View LiDAR Synthesis
[Zehan Zheng](https://dyfcalid.github.io/), [Fan Lu](https://fanlu97.github.io/), Weiyi Xue, [Guang Chen](https://ispc-group.github.io/)†, Changjun Jiang (†Corresponding author)
**CVPR 2024**

**[Paper (arXiv)](https://arxiv.org/abs/2404.02742) | [Paper (CVPR)](https://openaccess.thecvf.com/content/CVPR2024/html/Zheng_LiDAR4D_Dynamic_Neural_Fields_for_Novel_Space-time_View_LiDAR_Synthesis_CVPR_2024_paper.html) | [Project Page](https://dyfcalid.github.io/LiDAR4D) | [Video](https://www.youtube.com/watch?v=E6XyG3A3EZ8) | [Poster](https://drive.google.com/file/d/13cf0rSjCjGRyBsYOcQSa6Qf1Oe1a5QCy/view?usp=sharing) | [Slides](https://drive.google.com/file/d/1Q6yTVGoBf_nfWR4rW9RcSGlxRMufmSXc/view?usp=sharing)**
This repository is the official PyTorch implementation for LiDAR4D.
## Changelog
- 2024-6-1: We release the simulator for easier rendering and manipulation. *Happy Children's Day and have fun!*
- 2024-5-4: We update the flow fields and improve temporal interpolation.
- 2024-4-13: We update the U-Net of LiDAR4D for better ray-drop refinement.
- 2024-4-5: The code of LiDAR4D is released.
- 2024-4-4: The preprint paper is available on arXiv, along with the project page.
- 2024-2-27: Our paper is accepted by CVPR 2024.

## Demo
See the [project page](https://dyfcalid.github.io/LiDAR4D) or the [video](https://www.youtube.com/watch?v=E6XyG3A3EZ8) for demos.
## Introduction
LiDAR4D is a differentiable LiDAR-only framework for novel space-time LiDAR view synthesis, which reconstructs dynamic driving scenarios and generates realistic LiDAR point clouds end-to-end. It adopts 4D hybrid neural representations and motion priors derived from point clouds for geometry-aware and time-consistent large-scale scene reconstruction.
## Getting started
### Installation
```bash
git clone https://github.com/ispc-lab/LiDAR4D.git
cd LiDAR4D

conda create -n lidar4d python=3.9
conda activate lidar4d

# PyTorch
# CUDA 12.1
pip install torch==2.1.0 torchvision==0.16.0 torchaudio==2.1.0 --index-url https://download.pytorch.org/whl/cu121
# CUDA 11.8
# pip install torch==2.1.0 torchvision==0.16.0 torchaudio==2.1.0 --index-url https://download.pytorch.org/whl/cu118
# CUDA <= 11.7
# pip install torch==2.0.0 torchvision torchaudio

# Dependencies
pip install -r requirements.txt

# Local compile for tiny-cuda-nn
git clone --recursive https://github.com/nvlabs/tiny-cuda-nn
cd tiny-cuda-nn/bindings/torch
python setup.py install

# Compile packages in utils (run from the LiDAR4D repository root)
cd utils/chamfer3D
python setup.py install
```
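If everything compiled, a quick sanity check (optional, not part of the original instructions) is to confirm that the CUDA-enabled PyTorch build sees your GPU and that the tiny-cuda-nn bindings import cleanly:

```bash
# Optional sanity check for the environment set up above
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
python -c "import tinycudann; print('tiny-cuda-nn bindings OK')"
```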
### Dataset
#### KITTI-360 dataset ([Download](https://www.cvlibs.net/datasets/kitti-360/download.php))
We use sequence 00 (`2013_05_28_drive_0000_sync`) for the experiments in our paper.
Download the KITTI-360 dataset (2D images are not needed) and put it into `data/kitti360`
(or use symlinks: `ln -s DATA_ROOT/KITTI-360 ./data/kitti360/`).
The folder tree is as follows:

```bash
data
└── kitti360
    └── KITTI-360
        ├── calibration
        ├── data_3d_raw
        └── data_poses
```

Next, run the KITTI-360 dataset preprocessing (set `DATASET` and `SEQ_ID`; see the note below the command):
```bash
bash preprocess_data.sh
```
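For illustration only (the exact variable names and where they are set live inside `preprocess_data.sh`, so check the script before editing), the sequence selection might look like:

```bash
# Hypothetical example of the values to set before preprocessing
DATASET=kitti360   # dataset to preprocess
SEQ_ID=4950        # a KITTI-360 sequence ID, e.g. 1538, 2350, 4950, 8120, ...
```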
After preprocessing, your folder structure should look like this:

```bash
configs
├── kitti360_{sequence_id}.txt
data
└── kitti360
    ├── KITTI-360
    │   ├── calibration
    │   ├── data_3d_raw
    │   └── data_poses
    └── train
        ├── transforms_{sequence_id}test.json
        ├── transforms_{sequence_id}train.json
        └── transforms_{sequence_id}val.json
```

### Run LiDAR4D
Set the corresponding sequence config path in `--config`; you can also change the logging path in `--workspace`. Remember to set an available GPU ID in `CUDA_VISIBLE_DEVICES`.
Run the following command:
```bash
# KITTI-360
bash run_kitti_lidar4d.sh
```
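For example (a sketch only; `--config`, `--workspace`, and `CUDA_VISIBLE_DEVICES` are set inside `run_kitti_lidar4d.sh`, so match the script's actual layout, and the workspace path below is just a placeholder), training sequence 4950 on GPU 0 might look like:

```bash
# Sketch: select the GPU and point the script at the preprocessed sequence
export CUDA_VISIBLE_DEVICES=0          # or set it inside run_kitti_lidar4d.sh
# Inside run_kitti_lidar4d.sh, e.g.:
#   --config    configs/kitti360_4950.txt
#   --workspace log/kitti360_lidar4d_4950   # placeholder path for logs and checkpoints
bash run_kitti_lidar4d.sh
```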
## Results

**KITTI-360 *Dynamic* Dataset** (Sequences: `2350` `4950` `8120` `10200` `10750` `11400`)

| Method | Point Cloud CD↓ | Point Cloud F-Score↑ | Depth RMSE↓ | Depth MedAE↓ | Depth LPIPS↓ | Depth SSIM↑ | Depth PSNR↑ | Intensity RMSE↓ | Intensity MedAE↓ | Intensity LPIPS↓ | Intensity SSIM↑ | Intensity PSNR↑ |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| LiDAR-NeRF | 0.1438 | 0.9091 | 4.1753 | 0.0566 | 0.2797 | 0.6568 | 25.9878 | 0.1404 | 0.0443 | 0.3135 | 0.3831 | 17.1549 |
| LiDAR4D (Ours) § | 0.1002 | 0.9320 | 3.0589 | 0.0280 | 0.0689 | 0.8770 | 28.7477 | 0.0995 | 0.0262 | 0.1498 | 0.6561 | 20.0884 |
**KITTI-360 *Static* Dataset** (Sequences: `1538` `1728` `1908` `3353`)

| Method | Point Cloud CD↓ | Point Cloud F-Score↑ | Depth RMSE↓ | Depth MedAE↓ | Depth LPIPS↓ | Depth SSIM↑ | Depth PSNR↑ | Intensity RMSE↓ | Intensity MedAE↓ | Intensity LPIPS↓ | Intensity SSIM↑ | Intensity PSNR↑ |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| LiDAR-NeRF | 0.0923 | 0.9226 | 3.6801 | 0.0667 | 0.3523 | 0.6043 | 26.7663 | 0.1557 | 0.0549 | 0.4212 | 0.2768 | 16.1683 |
| LiDAR4D (Ours) § | 0.0834 | 0.9312 | 2.7413 | 0.0367 | 0.0995 | 0.8484 | 29.3359 | 0.1116 | 0.0335 | 0.1799 | 0.6120 | 19.0619 |
§: Latest results, which improve on the numbers reported in the paper.
*Experiments were conducted on an NVIDIA RTX 4090 GPU. Results may vary slightly due to randomness.*

## Simulation
After reconstruction, you can use the simulator to render and manipulate LiDAR point clouds across the whole scenario. It supports dynamic scene re-play, novel LiDAR configurations (`--fov_lidar`, `--H_lidar`, `--W_lidar`), and novel trajectories (`--shift_x`, `--shift_y`, `--shift_z`).
We also provide a simple demo setting that transforms the LiDAR configuration from KITTI-360 to nuScenes, using `--kitti2nus` in the bash script.
Check the sequence config and the corresponding workspace and model checkpoint path (`--ckpt`).
Run the following command:
```bash
bash run_kitti_lidar4d_sim.sh
```
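As an illustration (the flag names come from the description above, but where they are passed depends on `run_kitti_lidar4d_sim.sh`, and the checkpoint path is only a placeholder), re-rendering with a different scan resolution and a shifted trajectory might look like:

```bash
# Sketch: edit the simulator flags in run_kitti_lidar4d_sim.sh, e.g.:
#   --ckpt    log/kitti360_lidar4d_4950/checkpoints/xxx.pth   # trained model (placeholder path)
#   --H_lidar 128 --W_lidar 2048                              # novel LiDAR resolution
#   --shift_x 1.0 --shift_y 0.5 --shift_z 0.2                 # novel trajectory offsets
bash run_kitti_lidar4d_sim.sh
```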
The results will be saved in the workspace folder.

## Acknowledgement
We sincerely appreciate the great contribution of the following works:
- [tiny-cuda-nn](https://github.com/NVlabs/tiny-cuda-nn/tree/master)
- [LiDAR-NeRF](https://github.com/tangtaogo/lidar-nerf)
- [NFL](https://research.nvidia.com/labs/toronto-ai/nfl/)
- [K-Planes](https://github.com/sarafridov/K-Planes)

## Citation
If you find our repo or paper helpful, feel free to support us with a star or use the following citation:
```bibtex
@inproceedings{zheng2024lidar4d,
title = {LiDAR4D: Dynamic Neural Fields for Novel Space-time View LiDAR Synthesis},
author = {Zheng, Zehan and Lu, Fan and Xue, Weiyi and Chen, Guang and Jiang, Changjun},
booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
year = {2024}
}
```

## License
All code within this repository is under [Apache License 2.0](https://www.apache.org/licenses/LICENSE-2.0).