Towards Event-Based Visual Sensing for Spacecraft Pose Estimation: Baselines and Analysis
Keywords: event cameras, spacecraft pose estimation, neuromorphic vision, space situational awareness, SPARK2026, domain gap, robust perception, high dynamic range
TL;DR: Event-based vision maintains spacecraft pose estimation errors below 0.5 m across all orbital lighting conditions while frame-based methods degrade 3–4×, demonstrating event cameras as a robust alternative for space perception
Abstract: Accurate six-degree-of-freedom pose estimation of uncooperative spacecraft is critical for autonomous proximity operations, yet extreme orbital lighting—eclipses, solar glare, and Earth albedo—severely degrades conventional frame-based vision systems. Event cameras, which asynchronously report per-pixel brightness changes with high dynamic range exceeding 120 dB, offer a promising alternative. We establish baselines for event-based spacecraft pose estimation in the context of the SPARK2026 Challenge. Using a synthetic simulation pipeline, we evaluate an Event Frame CNN, a Voxel Grid CNN, and a conventional Frame-Based CNN across four lighting conditions. While the frame-based approach achieves the lowest error under nominal lighting (0.15 m translation, 2.5° rotation), its performance degrades by 3–4× in eclipse and high-contrast scenarios. Event-based methods maintain translation errors below 0.5 m across all conditions, reducing error by up to 40% relative to the frame-based baseline under eclipse. These findings highlight event cameras as a robust sensing modality for spacecraft pose estimation.
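The abstract's two event representations (event frames and voxel grids) can be sketched as simple accumulation operations over an event stream of (x, y, t, polarity) tuples. The function names, the signed-polarity accumulation, and the linearly interpolated temporal binning below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def events_to_frame(xs, ys, ps, height, width):
    """Event frame: accumulate signed polarities into a single 2D image.
    xs, ys: integer pixel coordinates; ps: polarities in {+1, -1}."""
    frame = np.zeros((height, width), dtype=np.float32)
    np.add.at(frame, (ys, xs), ps)  # unbuffered add handles repeated pixels
    return frame

def events_to_voxel_grid(xs, ys, ts, ps, height, width, num_bins=5):
    """Voxel grid: spread each event's polarity over a (num_bins, H, W)
    tensor, linearly interpolated between its two nearest temporal bins.
    Assumes ts is sorted; binning scheme is one common convention."""
    grid = np.zeros((num_bins, height, width), dtype=np.float32)
    t_norm = (ts - ts[0]) / max(ts[-1] - ts[0], 1e-9) * (num_bins - 1)
    lo = np.floor(t_norm).astype(int)
    hi = np.clip(lo + 1, 0, num_bins - 1)
    w_hi = t_norm - lo  # interpolation weight toward the later bin
    np.add.at(grid, (lo, ys, xs), ps * (1.0 - w_hi))
    np.add.at(grid, (hi, ys, xs), ps * w_hi)
    return grid
```

Either tensor can then be fed to a standard CNN backbone; the voxel grid preserves coarse temporal structure that a single event frame collapses.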
Email Sharing: We authorize the sharing of all author emails with Program Chairs.
Data Release: We authorize the release of our submission and author names to the public in the event of acceptance.
Submission Number: 56