TIMA SLAM: Tracking Independently and Mapping Altogether for an Uncalibrated Multi-Camera System

Published: 01 Jan 2021 · Last Modified: 14 Nov 2024 · Sensors 2021 · CC BY-SA 4.0
Abstract: We present a novel simultaneous localization and mapping (SLAM) system that extends the state-of-the-art ORB-SLAM2 for multi-camera use without precalibration. In this system, each camera is tracked independently on a shared map, and the extrinsic parameters of each camera in the fixed multi-camera rig are estimated online, up to a scale ambiguity for RGB cameras. The laborious precalibration of extrinsic parameters between cameras therefore becomes unnecessary. By simultaneously optimizing the map, the keyframe poses, and the relative poses of the multi-camera system, observations from the multiple cameras are used robustly, and the accuracy of the shared map is improved. The system is compatible not only with RGB sensors but also with RGB-D cameras. For RGB cameras, the performance of the system was evaluated on the well-known EuRoC/ASL and KITTI datasets, which provide stereo configurations for indoor and outdoor environments, respectively, as well as on our own dataset consisting of three cameras with small overlapping regions. For the RGB-D tests, we created a dataset consisting of two cameras in an indoor environment. The experimental results show that the proposed method provides an accurate multi-camera SLAM system without precalibration of the multi-camera rig.
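To make the idea of online extrinsic estimation concrete, the sketch below shows one plausible way to recover the fixed relative pose between a secondary camera and a reference camera from their independently tracked poses on a shared map. This is a minimal illustration under assumed conventions (world-to-camera 4x4 poses, synchronized keyframes), not the authors' implementation; all function and variable names are hypothetical, and the paper's actual system refines these quantities jointly with the map in bundle adjustment.

```python
# Minimal sketch: estimate the fixed reference-to-secondary camera transform
# from per-frame poses estimated independently on a shared map.
# Assumptions (not from the paper): poses are 4x4 world-to-camera matrices,
# and both cameras have poses at the same keyframe timestamps.
import numpy as np


def se3_inverse(T):
    """Invert a 4x4 rigid transform."""
    R, t = T[:3, :3], T[:3, 3]
    Tinv = np.eye(4)
    Tinv[:3, :3] = R.T
    Tinv[:3, 3] = -R.T @ t
    return Tinv


def average_rotation(rotations):
    """Chordal (SVD-based) average of a list of 3x3 rotation matrices."""
    M = sum(rotations) / len(rotations)
    U, _, Vt = np.linalg.svd(M)
    R = U @ Vt
    if np.linalg.det(R) < 0:  # project onto SO(3), enforce a proper rotation
        U[:, -1] *= -1
        R = U @ Vt
    return R


def estimate_relative_pose(poses_ref, poses_cam):
    """Return the rigid transform mapping the reference camera frame to the
    secondary camera frame, averaged over per-keyframe estimates."""
    # Per keyframe: T_cam<-ref = T_cam<-world @ T_world<-ref
    rels = [T_cam @ se3_inverse(T_ref)
            for T_ref, T_cam in zip(poses_ref, poses_cam)]
    T = np.eye(4)
    T[:3, :3] = average_rotation([T_i[:3, :3] for T_i in rels])
    T[:3, 3] = np.mean([T_i[:3, 3] for T_i in rels], axis=0)
    return T
```

In a system like the one described, such an averaged estimate would only serve as an initialization; the relative poses are then kept as optimization variables so that multi-camera observations can refine them together with the keyframe poses and the map points. For purely RGB input, the translation component of this transform is recoverable only up to the global scale of the monocular maps.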
