Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
ZF 4D Radar Dataset for Autonomous Driving
https://github.com/ZF4DRadSet/ZF-4DRadar-Dataset
- Host: GitHub
- URL: https://github.com/ZF4DRadSet/ZF-4DRadar-Dataset
- Owner: ZF4DRadSet
- Created: 2023-02-13T06:41:16.000Z (almost 2 years ago)
- Default Branch: main
- Last Pushed: 2023-03-27T03:57:28.000Z (over 1 year ago)
- Last Synced: 2024-08-02T21:38:38.104Z (3 months ago)
- Size: 6.84 KB
- Stars: 48
- Watchers: 8
- Forks: 1
- Open Issues: 1
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
- awesome-radar-perception - GitHub
README
# ZF-4DRadar-Dataset
This repository provides a ZF FRGen21 4D radar dataset for autonomous driving object detection.
## Overview
* Introduction
* Sensor and Data
* Demos
* Data Release

---
## Introduction
This dataset contains 7,000 frames of 4D millimeter-wave radar data, together with camera and lidar data collected and calibrated synchronously. The dataset covers four typical scenarios: urban complex traffic, parking lot, tunnel, and highway. Object detection annotations cover four classes: Car, Bicycle, Pedestrian, and Truck_Bus.
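For reference, here is a minimal sketch of how the scene and class taxonomy described above could be encoded for a training or evaluation script. The string names and integer ids are illustrative assumptions, not an official mapping shipped with the dataset:

```python
# Hypothetical taxonomy for the ZF 4D radar dataset (names and ids are illustrative).
SCENES = ["urban_complex_traffic", "parking_lot", "tunnel", "highway"]

CLASS_TO_ID = {
    "Car": 0,
    "Bicycle": 1,
    "Pedestrian": 2,
    "Truck_Bus": 3,
}

# Reverse lookup for decoding predicted class ids back to names.
ID_TO_CLASS = {idx: name for name, idx in CLASS_TO_ID.items()}
```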
## Sensor and Data
The 4D radar used for this dataset is the ZF FRGen21. It offers high detection performance in four dimensions (range, velocity, azimuth, and elevation) and can detect objects at distances of up to 200 m. The annotation format is similar to that of the KITTI dataset: it contains the multi-sensor timestamps, the calibration parameters, and, for each object, the detection class and the position and size of its bounding box. The 7,000-frame dataset is divided into a training set of 5,000 frames, a validation set of 1,000 frames, and a test set of 1,000 frames.
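As a rough illustration of how a KITTI-style annotation could be consumed, the sketch below parses a hypothetical label line (class, 3D box center, size, and yaw) and builds the 5,000/1,000/1,000 split by frame index. The field order, coordinate frame, file layout, and `labels/` directory name are assumptions; consult the released annotation files for the actual format.

```python
from dataclasses import dataclass
from pathlib import Path


@dataclass
class Box3D:
    cls: str      # object class, e.g. "Car"
    x: float      # box center (m), ego/radar coordinates (assumed)
    y: float
    z: float
    length: float  # box size (m)
    width: float
    height: float
    yaw: float     # heading angle (rad)


def parse_label_line(line: str) -> Box3D:
    """Parse one KITTI-style label line: 'cls x y z l w h yaw' (assumed field order)."""
    cls, *values = line.split()
    x, y, z, l, w, h, yaw = map(float, values)
    return Box3D(cls, x, y, z, l, w, h, yaw)


def split_frames(num_frames: int = 7000):
    """Index-based 5000/1000/1000 train/val/test split (sequential split assumed)."""
    ids = list(range(num_frames))
    return ids[:5000], ids[5000:6000], ids[6000:]


# Example usage with a hypothetical per-frame label file layout (labels/000000.txt, ...).
label_dir = Path("labels")
train_ids, val_ids, test_ids = split_frames()
sample = label_dir / "000000.txt"
if sample.exists():
    boxes = [parse_label_line(l) for l in sample.read_text().splitlines() if l.strip()]
```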
## Demos
The demos below show the object detection results on the test set in the four scenarios.

## Data Release
Due to policy, we will make the dataset publicly available this year.