Benchmarking Localization, 3D Reconstruction and Radiance Fields for Navigation Across Day and Night
Keywords: Benchmark Datasets, Oxford, Day-Night Illumination, Localization, 3D Reconstruction, Radiance Fields, Novel View Synthesis, Visual Relocalization
TL;DR: Two Oxford datasets: one with survey-grade 3D ground truth, the other with egocentric day–night video; together they benchmark SLAM, 3D reconstruction, radiance fields, and visual relocalization under illumination changes.
Abstract: For robots operating across day and night, the ability to localize and model the environment under varying illumination is essential. We present two complementary datasets for developing and benchmarking perception algorithms for localization, 3D reconstruction, and novel-view synthesis in large-scale indoor and outdoor environments. The first, the Oxford Spires Dataset, provides multi-sensor recordings around historical Oxford landmarks, paired with millimeter-accurate 3D ground-truth maps that enable precise trajectory estimation. The second, the Oxford Day and Night Dataset, captures egocentric video in the same areas under diverse lighting conditions, from daylight to nighttime. Together, these datasets offer a unique platform for advancing robust perception methods capable of handling challenging illumination changes.
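As a minimal illustration of the kind of evaluation such ground truth supports, the sketch below computes the Absolute Trajectory Error (ATE) of an estimated trajectory against a reference trajectory. It is a generic, hypothetical example in NumPy (the Nx3 array format, time synchronization, and rigid alignment are assumptions), not part of either dataset's official tooling.

```python
# Hypothetical ATE evaluation sketch; not the datasets' official benchmark code.
import numpy as np

def align_rigid(est: np.ndarray, gt: np.ndarray):
    """Least-squares SE(3) alignment of est (Nx3) onto gt (Nx3) via Horn/Kabsch."""
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    H = (est - mu_e).T @ (gt - mu_g)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # reflection guard
    R = Vt.T @ S @ U.T
    t = mu_g - R @ mu_e
    return R, t

def ate_rmse(est: np.ndarray, gt: np.ndarray) -> float:
    """RMSE of translational error after rigid alignment (trajectories time-synchronized)."""
    R, t = align_rigid(est, gt)
    err = (est @ R.T + t) - gt
    return float(np.sqrt((err ** 2).sum(axis=1).mean()))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gt = np.cumsum(rng.normal(size=(100, 3)), axis=0)   # stand-in ground-truth positions
    est = gt + rng.normal(scale=0.01, size=gt.shape)    # stand-in SLAM estimate
    print(f"ATE RMSE: {ate_rmse(est, gt):.4f} m")
```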
Submission Number: 10