Towards Onboard Multi-View Analysis of Satellite Imagery: Monocular 2D Motion Estimation and 3D Reconstruction

Published: 26 Apr 2026 · Last Modified: 26 Apr 2026 · AI4Space · License: CC BY 4.0
Keywords: Earth Observation, Edge AI, Monocular Multi-View Analysis, 3D Scene Reconstruction, Deep Learning for Optical Flow
TL;DR: A framework for performing monocular multi-view image analysis directly onboard satellite edge processors, validated on real orbital data ahead of an upcoming flight.
Abstract: Driven by high-performance processors, modern Low Earth Orbit (LEO) satellites can now execute AI and deep learning workloads in orbit. This shift enables spacecraft to process sensor data into actionable insights rather than serving as mere conduits for downlinking. However, current onboard AI predominantly focuses on single-image analysis tasks like segmentation and detection. These methods fail to leverage the motion and geometric information inherent in multi-view imagery, which is crucial for characterizing dynamic volumetric phenomena—such as clouds and plumes—and for enabling autonomous, real-time monitoring without the latency of ground-station downlink. Multi-view analysis and photogrammetry remain largely constrained to ground-based processing due to their high computational demands or reliance on large external datasets. To address this gap, we present a framework for onboard monocular multi-view analysis designed for edge deployment; it performs 2D motion estimation—exploring both classical and learning-based methods—and subsequent geometric 3D scene reconstruction. This work represents a pioneering effort in transitioning multi-view analysis directly onboard Earth Observation (EO) satellites. We validate our framework using real-world data from the [Anonymous] mission, a satellite testbed for AI flight software, paving the way for upcoming deployment on this and other platforms with advanced compute.
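The abstract describes a two-stage pipeline: estimate 2D motion between successive monocular views, then recover scene geometry from that motion. As a minimal illustration of the underlying principle (not the authors' actual method, whose components are not detailed here), the sketch below estimates an integer-pixel translational shift between two synthetic frames by brute-force SSD matching, then converts the resulting disparity to depth with the standard two-view pinhole relation Z = f·B/d. The focal length and baseline values are illustrative placeholders, not mission parameters.

```python
import numpy as np

def estimate_shift(img1, img2, max_shift=5):
    """Brute-force translational motion estimate: find the integer (dy, dx)
    shift of img2 that best matches img1 under sum-of-squared-differences."""
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # Undo a candidate shift of img2 and compare against img1.
            shifted = np.roll(np.roll(img2, -dy, axis=0), -dx, axis=1)
            err = np.sum((img1 - shifted) ** 2)
            if err < best_err:
                best_err, best = err, (dy, dx)
    return best

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Two-view pinhole depth: Z = f * B / d (rectified, translational case)."""
    return focal_px * baseline_m / disparity_px

# Synthetic two-view pair: a bright patch appears shifted +3 px along-track.
img1 = np.zeros((64, 64))
img1[20:30, 20:30] = 1.0
img2 = np.roll(img1, 3, axis=1)

dy, dx = estimate_shift(img1, img2)          # expect (0, 3)
# Illustrative constants (hypothetical, not from the paper):
Z = depth_from_disparity(abs(dx), focal_px=100.0, baseline_m=15_000.0)
```

Real pipelines replace the exhaustive search with dense optical flow (classical, e.g. Farnebäck, or learning-based) and the single global disparity with per-pixel triangulation, but the flow-then-geometry structure is the same.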
Email Sharing: We authorize the sharing of all author emails with Program Chairs.
Data Release: We authorize the release of our submission and author names to the public in the event of acceptance.
Submission Number: 38