Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://tiers.github.io/multi_lidar_multi_uav_dataset/
Last synced: 3 months ago
- Host: GitHub
- URL: https://tiers.github.io/multi_lidar_multi_uav_dataset/
- Owner: TIERS
- License: mit
- Created: 2023-07-28T07:24:35.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2024-01-03T18:31:13.000Z (10 months ago)
- Last Synced: 2024-07-31T02:34:54.451Z (4 months ago)
- Language: C++
- Size: 10.7 MB
- Stars: 48
- Watchers: 0
- Forks: 5
- Open Issues: 1
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- Awesome-3D-LiDAR-Datasets - A Multi-LiDAR Multi-UAV Dataset | 1x Ouster OS1-64 | 1x Livox Avia, 1x Livox Mid-360 | Odom | Small (Summary Table / Update: 2023-07-12)
README
Towards Robust UAV Tracking in GNSS-Denied Environments: A Multi-LiDAR Multi-UAV Dataset
Project Page • Paper • Contact Us
We present a novel multi-LiDAR dataset specifically designed for UAV tracking. Our dataset includes data from a spinning LiDAR, two solid-state LiDARs with different Field of View (FoV) and scan patterns, and an RGB-D camera. This diverse sensor suite allows for research on new challenges in the field, including limited-FoV adaptability and multi-modality data processing. For a comprehensive list of sequences, refer to the paper [Towards Robust UAV Tracking in GNSS-Denied Environments: A Multi-LiDAR Multi-UAV Dataset](https://arxiv.org/abs/2310.09165) and the [project page](https://tiers.github.io/multi_lidar_multi_uav_dataset).
## Calibration
We provide a ROS package that computes the extrinsic parameters between the LiDARs and the camera based on GICP. As the OS1 has the largest FoV, it is treated as the base reference frame ("base_link") into which all the other point clouds are transformed. For the Avia, Mid-360, and RealSense D435, we integrated the first five frames to increase point cloud density.
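The package implements this alignment in C++ on top of PCL. For readers who want the gist of the GICP step, here is a minimal, self-contained sketch; the file names, voxel leaf size, and iteration count are illustrative assumptions, not the package's actual values:
~~~
// Minimal sketch of GICP-based extrinsic estimation with PCL.
// File names and parameters are placeholders, not the package's actual values.
#include <pcl/io/pcd_io.h>
#include <pcl/point_types.h>
#include <pcl/filters/voxel_grid.h>
#include <pcl/registration/gicp.h>
#include <iostream>

int main()
{
  using Cloud = pcl::PointCloud<pcl::PointXYZ>;
  Cloud::Ptr target(new Cloud), source(new Cloud);

  // Target: integrated OS1 scans (the base_link reference).
  // Source: integrated Avia / Mid-360 / D435 scans to be aligned to it.
  pcl::io::loadPCDFile("os1_integrated.pcd", *target);
  pcl::io::loadPCDFile("avia_integrated.pcd", *source);

  // Downsample both clouds so GICP converges faster.
  pcl::VoxelGrid<pcl::PointXYZ> voxel;
  voxel.setLeafSize(0.05f, 0.05f, 0.05f);
  Cloud::Ptr target_ds(new Cloud), source_ds(new Cloud);
  voxel.setInputCloud(target);
  voxel.filter(*target_ds);
  voxel.setInputCloud(source);
  voxel.filter(*source_ds);

  // GICP aligns the source cloud to the target, i.e. source frame -> base_link.
  pcl::GeneralizedIterativeClosestPoint<pcl::PointXYZ, pcl::PointXYZ> gicp;
  gicp.setInputSource(source_ds);
  gicp.setInputTarget(target_ds);
  gicp.setMaximumIterations(100);

  Cloud aligned;
  gicp.align(aligned);

  std::cout << "converged: " << gicp.hasConverged()
            << "  fitness: " << gicp.getFitnessScore() << "\n"
            << "extrinsic (4x4):\n" << gicp.getFinalTransformation() << std::endl;
  return 0;
}
~~~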
To use this package, play the Calibration rosbag from our dataset:
~~~
rosbag play Calibration.bag -l
~~~
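If you want to check which topics and sensors the bag contains before running the calibration (standard rosbag usage, not specific to this package):
~~~
rosbag info Calibration.bag
~~~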
Then run our calibration launch file:
~~~
roslaunch multi_lidar_multi_uav_dataset lidars_extrinsic_computation.launch
~~~
The computed extrinsic parameters will appear in the terminal:
~~~
OS -> base_link 0 0 0 0 0 0 /os_sensor /base_link 10
Avia -> base_link 0.149354 0.0423582 -0.0524961 3.13419 -3.13908 -3.13281 /avia_frame /base_link 10
Mid360 -> base_link 0.125546 -0.0554536 -0.20206 0.00467344 0.0270294 0.0494959 /mid360_frame /base_link 10
Camera -> base_link -0.172863 0.11895 -0.101785 1.55222 3.11188 1.60982 /camera_depth_optical_frame /base_link 10
~~~
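Each line lists a translation, three rotation angles, the two frame IDs, and a publishing period, which has the same shape as the arguments of tf's `static_transform_publisher` (`x y z yaw pitch roll frame_id child_frame_id period_ms`). If that reading holds for your use case, a transform can be republished directly, for example (verify the rotation convention and frame order against the package's launch files before relying on it):
~~~
rosrun tf static_transform_publisher 0.149354 0.0423582 -0.0524961 3.13419 -3.13908 -3.13281 /avia_frame /base_link 10
~~~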
## Install

The code has been tested on Ubuntu 20.04 with ROS Noetic.

### Dependencies
- PCL
- Eigen
- livox_ros_driver: follow the [livox_ros_driver Installation](https://github.com/Livox-SDK/livox_ros_driver) instructions.
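On Ubuntu 20.04 with ROS Noetic, PCL and Eigen are commonly installed from apt; the package names below are a suggestion, not a requirement stated by this repository:
```
sudo apt install libpcl-dev libeigen3-dev ros-noetic-pcl-ros
```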
### Build
```
cd ~/catkin_ws/src
git clone https://github.com/TIERS/multi_lidar_multi_uav_dataset
cd ..
catkin build
```
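After the build, source the workspace before using the launch file from the Calibration section (standard catkin workflow; the workspace path `~/catkin_ws` is an assumption):
```
source ~/catkin_ws/devel/setup.bash
roslaunch multi_lidar_multi_uav_dataset lidars_extrinsic_computation.launch
```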
## Citation

If you use this dataset for any academic work, please cite the following publication:
```
@inproceedings{catalano2023towards,
title={Towards robust uav tracking in gnss-denied environments: a multi-lidar multi-uav dataset},
author={Catalano, Iacopo and Yu, Xianjia and Queralta, Jorge Pe{\~n}a},
booktitle={2023 IEEE International Conference on Robotics and Biomimetics (ROBIO)},
pages={1--7},
year={2023},
organization={IEEE}
}
```