VoxelScape: Large Scale Simulated 3D Point Cloud Dataset of Urban Traffic Environments

07 Jun 2021 (modified: 24 May 2023). Submitted to NeurIPS 2021 Datasets and Benchmarks Track (Round 1).
Keywords: Realistically simulated point cloud dataset
TL;DR: Realistically simulated point cloud dataset of urban traffic environments
Abstract: A profound understanding of the surrounding environment is one of the crucial requirements for the reliable operation of future self-driving cars. The Light Detection and Ranging (LiDAR) sensor plays a critical role in achieving such understanding due to its capability to perceive the world in 3D. As in 2D perception, current state-of-the-art methods for 3D perception tasks rely on deep neural networks (DNNs). However, the performance on 3D perception tasks, especially point-wise semantic segmentation, is not on par with that of their 2D counterparts. One of the main reasons is the lack of publicly available labelled 3D point cloud datasets (PCDs) from 3D LiDAR sensors. In this work, we introduce the VoxelScape dataset, a large-scale simulated 3D PCD with 100K annotated point cloud scans. The annotations in the VoxelScape dataset include both point-wise semantic labels and 3D bounding box labels. Additionally, we used a number of baseline approaches to validate the transferability of VoxelScape to real 3D PCDs on two challenging 3D perception tasks. The promising results show that training DNNs on VoxelScape boosts the performance of these 3D perception tasks on real PCDs. The VoxelScape dataset is publicly available at https://voxel-scape.github.io/dataset/.
Supplementary Material: zip
URL: https://voxel-scape.github.io/dataset/
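
As a rough illustration of how annotations of this kind (a LiDAR scan plus per-point semantic labels) are typically consumed, the sketch below loads a single scan and its labels. It assumes a SemanticKITTI-style binary layout; the file names, dtypes, and field layout are assumptions for illustration only, not the dataset's documented format, so consult the dataset page above for the actual structure.

```python
# Hypothetical loading sketch -- the real VoxelScape layout may differ;
# see https://voxel-scape.github.io/dataset/ for the documented format.
import numpy as np


def load_scan(scan_path: str) -> np.ndarray:
    """Load one LiDAR scan as an (N, 4) array of x, y, z, intensity.

    Assumes a flat float32 binary file with 4 values per point (assumption).
    """
    return np.fromfile(scan_path, dtype=np.float32).reshape(-1, 4)


def load_semantic_labels(label_path: str) -> np.ndarray:
    """Load per-point semantic labels as an (N,) integer array.

    Assumes one uint32 label id per point, in scan order (assumption).
    """
    return np.fromfile(label_path, dtype=np.uint32)


if __name__ == "__main__":
    points = load_scan("000000.bin")                # hypothetical file name
    labels = load_semantic_labels("000000.label")   # hypothetical file name
    # Point-wise annotation means exactly one semantic label per point.
    assert points.shape[0] == labels.shape[0]
    print(f"{points.shape[0]} points, {np.unique(labels).size} classes present")
```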