K-Radar: 4D Radar Object Detection for Autonomous Driving in Various Weather Conditions

Published: 17 Sept 2022, Last Modified: 03 Jul 2024, NeurIPS 2022 Datasets and Benchmarks
Keywords: 4D Radar, 4D Radar tensor, 3D object detection, Adverse weather, Autonomous driving
Abstract: Unlike RGB cameras that use visible light bands (384∼769 THz) and Lidars that use infrared bands (331∼361 THz), Radars use relatively longer-wavelength radio bands (77∼81 GHz), resulting in robust measurements in adverse weather. Unfortunately, existing Radar datasets contain only a small number of samples compared to existing camera and Lidar datasets, which may hinder the development of sophisticated data-driven deep learning techniques for Radar-based perception. Moreover, most existing Radar datasets only provide 3D Radar tensor (3DRT) data that contain power measurements along the Doppler, range, and azimuth dimensions. As there is no elevation information, it is challenging to estimate the 3D bounding box of an object from 3DRT. In this work, we introduce KAIST-Radar (K-Radar), a novel large-scale object detection dataset and benchmark that contains 35K frames of 4D Radar tensor (4DRT) data with power measurements along the Doppler, range, azimuth, and elevation dimensions, together with carefully annotated 3D bounding box labels of objects on the roads. K-Radar includes challenging driving conditions such as adverse weather (fog, rain, and snow) on various road structures (urban and suburban roads, alleyways, and highways). In addition to the 4DRT, we provide auxiliary measurements from carefully calibrated high-resolution Lidars, surround stereo cameras, and RTK-GPS. We also provide 4DRT-based object detection baseline neural networks (baseline NNs) and show that height information is crucial for 3D object detection. Furthermore, by comparing the baseline NN with a similarly structured Lidar-based neural network, we demonstrate that 4D Radar is a more robust sensor in adverse weather conditions. All code is available at https://github.com/kaist-avelab/k-radar.
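The difference the abstract draws between 3DRT and 4DRT comes down to the extra elevation axis. A minimal sketch in Python/NumPy, assuming an illustrative tensor shape and axis order (the actual K-Radar cube dimensions and data loaders are defined in the repository and may differ), shows how collapsing the elevation dimension of a 4DRT yields a 3DRT-like tensor and discards the height cues needed for 3D bounding-box estimation:

```python
import numpy as np

# Hypothetical 4D Radar tensor (4DRT): power measurements indexed by
# (Doppler, range, azimuth, elevation). The shape below is illustrative
# only; consult the K-Radar repository for the real sensor configuration.
num_doppler, num_range, num_azimuth, num_elevation = 64, 256, 107, 37
tensor_4drt = np.random.rand(num_doppler, num_range, num_azimuth, num_elevation)

# A 3DRT (as provided by most prior Radar datasets) carries power only
# along Doppler, range, and azimuth. Reducing the 4DRT over its elevation
# axis produces a comparable 3D tensor, but the height information the
# paper identifies as crucial for 3D detection is lost in the process.
tensor_3drt = tensor_4drt.sum(axis=-1)      # shape: (64, 256, 107)

# Conversely, reducing over Doppler keeps the range-azimuth-elevation
# structure that allows a detector to regress full 3D bounding boxes.
range_azimuth_elevation = tensor_4drt.mean(axis=0)  # shape: (256, 107, 37)
print(tensor_3drt.shape, range_azimuth_elevation.shape)
```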
Author Statement: Yes
URL: https://github.com/kaist-avelab/k-radar
Dataset Url: https://github.com/kaist-avelab/k-radar
Supplementary Video Clip Urls:
(1) Sensor Measurements Dynamically Changing during Driving under the Heavy Snow Condition: https://www.youtube.com/watch?v=TZh5i2eLp1k&t=117s
(2) 4D Radar Tensor & Lidar Point Cloud Calibration and Annotation Process: https://www.youtube.com/watch?v=ylG0USHCBpU&t=166s
(3) Annotation Process in the Absence of Lidar Point Cloud Measurements of Objects: https://www.youtube.com/watch?v=ILlBJJpm4_4&t=15s
(4) 4D Radar Tensor & Lidar Point Cloud Calibration Results: https://www.youtube.com/watch?v=U4qkaMSJOds&t=18s
(5) GUI-based Program for Visualization and Neural Network Inference: https://www.youtube.com/watch?v=MrFPvO1ZjTY&t=10s
(6) Information on Tracking for Multiple Objects on the Roads: https://www.youtube.com/watch?v=8mqxf58_ZAk
Dataset Embargo: We will open the project page promptly, but the dataset itself will be released at the end of September 2022. Since the dataset requires a large amount of storage (~13 TB), we expect it will take about three months to set up the server.
Supplementary Material: pdf
License: The K-Radar dataset is published under the CC BY-NC-ND license, and all code is published under the Apache License 2.0.
Contribution Process Agreement: Yes
In Person Attendance: Yes
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/arxiv:2206.08171/code)