TrinitySLAM: On-board Real-time Event-image Fusion SLAM System for Drones

Xinjun Cai, Jingao Xu, Kuntian Deng, Hongbo Lan, Yue Wu, Xiangwen Zhuge, Zheng Yang

Published: 2024 · ACM Trans. Sens. Networks 2024 · CC BY-SA 4.0
Abstract: Drones have gained extensive popularity across diverse smart applications, and visual Simultaneous Localization and Mapping (SLAM) technology is commonly used to estimate the six-degrees-of-freedom pose for drone flight control systems. However, traditional image-based SLAM cannot ensure the flight safety of drones, especially in challenging conditions such as high-speed flight and high dynamic range scenes. The event camera, a new vision sensor, holds the potential to enable drones to overcome these challenging scenarios when fused with image-based SLAM. Unfortunately, the computational demands of event-image fusion SLAM have grown many-fold compared with image-based SLAM, and existing research on visual SLAM acceleration cannot achieve real-time operation of event-image fusion SLAM on on-board computing platforms for drones. To fill this gap, we present TrinitySLAM, a high-accuracy, real-time, low-energy event-image fusion SLAM acceleration framework built on Xilinx Zynq, an on-board heterogeneous computing platform. The key innovations of TrinitySLAM include a fine-grained computation allocation strategy, several novel hardware–software co-acceleration designs, and an efficient data exchange mechanism. We fully implement TrinitySLAM on the latest Zynq UltraScale+ platform and evaluate its performance on one custom-made drone dataset and four official datasets covering various scenarios. Comprehensive experiments show that TrinitySLAM improves pose estimation accuracy by 28% while halving end-to-end latency and reducing energy consumption by 1.2× compared with the most comparable state-of-the-art heterogeneous-platform acceleration baseline.