CADSim: Robust and Scalable in-the-wild 3D Reconstruction for Controllable Sensor Simulation

16 Jun 2022, 10:45 (modified: 17 Nov 2022, 16:54) · CoRL 2022 Poster
Student First Author: yes
Keywords: 3D Reconstruction, CAD models, Sensor Simulation, Self-Driving
TL;DR: We propose a new method that reconstructs objects from sensor observations with high fidelity, part awareness, and aligned geometry, in a form compatible with graphics engines, enabling efficient, realistic, and controllable simulation.
Abstract: Realistic simulation is key to enabling safe and scalable development of self-driving vehicles. A core component is simulating the sensors so that the entire autonomy system can be tested in simulation. Sensor simulation involves modeling traffic participants, such as vehicles, with high-quality appearance and articulated geometry, and rendering them in real-time. The self-driving industry has employed artists to build these assets. However, this is expensive, slow, and may not reflect reality. Instead, reconstructing assets automatically from sensor data collected in the wild would provide a better path to generating a large, diverse set of assets with good real-world coverage. However, current reconstruction approaches struggle on in-the-wild sensor data due to its sparsity and noise. To tackle these issues, we present CADSim, which combines part-aware object-class priors from a small set of CAD models with differentiable rendering to automatically reconstruct vehicle geometry, including articulated wheels, with high-quality appearance. Our experiments show that our approach recovers more accurate shape from sparse data than existing approaches. Importantly, it also trains and renders efficiently. We demonstrate our reconstructed vehicles in a wide range of applications, including accurate testing of autonomy perception systems.
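The core idea of fitting a parametric template to sparse, noisy in-the-wild observations by gradient descent can be illustrated with a toy analogue. The sketch below is not the paper's method: it stands in a 2D circle (a "wheel template" with center and radius parameters) for the CAD-derived vehicle prior, and optimizes those parameters against noisy surface points with hand-derived gradients, in the same spirit as fitting a shape prior via a differentiable objective. All names and values here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "CAD prior": a wheel modeled as a circle with parameters (center c, radius r).
true_c, true_r = np.array([1.0, 0.5]), 2.0
angles = rng.uniform(0.0, 2 * np.pi, 40)                 # sparse sampling of the surface
obs = true_c + true_r * np.stack([np.cos(angles), np.sin(angles)], axis=1)
obs += rng.normal(0.0, 0.05, obs.shape)                  # simulated sensor noise

# Fit template parameters by gradient descent on point-to-surface residuals,
# analogous to optimizing shape parameters through a differentiable loss.
c, r = np.zeros(2), 1.0
lr = 0.1
for _ in range(500):
    diff = obs - c                          # (N, 2) vectors from center to points
    d = np.linalg.norm(diff, axis=1)        # distances to current center
    res = d - r                             # signed residual to the circle surface
    # Analytic gradients of L = mean(res**2):
    grad_r = -2.0 * res.mean()
    grad_c = (2.0 * res[:, None] * (-diff / d[:, None])).mean(axis=0)
    r -= lr * grad_r
    c -= lr * grad_c

print(c, r)  # recovers approximately center (1.0, 0.5) and radius 2.0
```

The actual system replaces the circle with an articulated CAD mesh and the point residual with a differentiable rendering loss over appearance and geometry, but the optimization structure (parametric prior, gradient-based fitting to raw sensor data) is the same.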
Supplementary Material: zip