Abstract: Object pose tracking is one of the pivotal technologies in multimedia, attracting ever-growing attention in recent years. Existing methods employing traditional cameras encounter numerous challenges such as motion blur, sensor noise, partial occlusion, and changing lighting conditions. Emerging bio-inspired sensors, particularly event cameras, offer advantages such as high dynamic range and low latency, which hold the potential to address the aforementioned challenges. In this work, we present an optical flow-guided 6DoF object pose tracking method with an event camera. A 2D-3D hybrid feature extraction strategy is first employed to detect corners and edges from events and object models, which characterizes object motion precisely. We then search for the optical flow of corners by maximizing the event-associated probability within a spatio-temporal window, and establish correspondences between corners and edges guided by the optical flow. Finally, by minimizing the distances between corners and edges, the 6DoF object pose is iteratively optimized to achieve continuous pose tracking. Experimental results on both simulated and real events demonstrate that our method outperforms state-of-the-art event-based methods in terms of both accuracy and robustness.
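To make the pipeline in the abstract concrete, the following is a minimal Python/NumPy/SciPy sketch of the flow-guided corner-edge association and iterative 6DoF pose refinement. All function names, the nearest-neighbour matching to the flow-displaced corner, and the generic least-squares solver are illustrative assumptions, not the authors' actual implementation.

```python
# Minimal sketch: optical flow-guided corner-edge association and 6DoF pose refinement.
# Assumptions: corners and their flow vectors are already extracted from events,
# and edge points are sampled from the 3D object model.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation


def project(points_3d, pose, K):
    """Project 3D model points into the image; pose = (rotation vector, translation)."""
    R = Rotation.from_rotvec(pose[:3]).as_matrix()
    cam = points_3d @ R.T + pose[3:]
    uv = cam @ K.T
    return uv[:, :2] / uv[:, 2:3]


def corner_edge_residuals(pose, corners_2d, flows_2d, edge_points_3d, K):
    """Distances between flow-displaced event corners and their matched projected edges."""
    edges_2d = project(edge_points_3d, pose, K)
    residuals = []
    for c, f in zip(corners_2d, flows_2d):
        target = c + f                                  # corner position predicted by optical flow
        d = edges_2d - target
        j = np.argmin(np.einsum('ij,ij->i', d, d))      # nearest projected edge point (simplified matching)
        residuals.append(d[j])
    return np.concatenate(residuals)


def track_pose(pose_init, corners_2d, flows_2d, edge_points_3d, K):
    """Iteratively minimise corner-edge distances to update the 6DoF pose."""
    result = least_squares(
        corner_edge_residuals, pose_init,
        args=(corners_2d, flows_2d, edge_points_3d, K))
    return result.x
```

In this sketch the correspondence search is reduced to a nearest-neighbour lookup around the flow-displaced corner; the paper's search along the flow direction and its event-associated probability term would replace that step.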
Primary Subject Area: [Experience] Multimedia Applications
Secondary Subject Area: [Content] Vision and Language
Relevance To Conference: Object pose tracking is a hot research topic in the robotics and multimedia communities, with significant real-world applications such as augmented reality, robotic grasping, and autonomous navigation. Nonetheless, traditional cameras are constrained by factors such as low frame rates and limited dynamic range, so challenges in object tracking persist, including drastic lighting changes, motion blur caused by rapid object motion, partial occlusion among objects, and interference from cluttered backgrounds. Emerging bio-inspired sensors, particularly event cameras, offer advantages such as high dynamic range and low latency, which hold the potential to address these challenges. In this paper, we propose an optical flow-guided 6DoF object pose tracking method using an event camera. Our method first employs a hybrid feature extraction strategy, detecting object corners from the Time Surfaces (TSs) of events and extracting object edges from the projected point cloud. We then compute the optical flow of the corners and search for correspondences with edges along the direction of the optical flow. Finally, we formulate object pose tracking as an optical flow-guided iterative optimization problem. Experimental results on simulated and real events indicate that our method surpasses current state-of-the-art event-based approaches.
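As an illustration of the Time Surface representation mentioned above, below is a minimal sketch of building a TS from an event stream. The exponential decay with constant `tau` and the per-pixel latest-timestamp scheme are common choices assumed here for illustration, not necessarily the exact variant used in the paper.

```python
# Minimal sketch: Time Surface (TS) construction from events, on which corners
# could subsequently be detected (e.g. with a Harris-style detector).
import numpy as np


def time_surface(events, height, width, t_ref, tau=30e-3):
    """events: iterable of (x, y, t, polarity); returns a decayed recency image."""
    last_t = np.full((height, width), -np.inf)
    for x, y, t, _ in events:
        last_t[int(y), int(x)] = t               # keep the most recent event per pixel
    ts = np.exp((last_t - t_ref) / tau)          # exponential decay of event recency
    ts[np.isneginf(last_t)] = 0.0                # pixels that never received an event
    return ts
```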
Supplementary Material: zip
Submission Number: 2119