VAD4Space: Visual Anomaly Detection for Planetary Surface Imagery

Published: 26 Apr 2026, Last Modified: 26 Apr 2026
Venue: AI4Space
License: CC BY 4.0
Keywords: Visual Anomaly Detection, Planetary Surface Imagery
TL;DR: AI can automatically spot rare geological anomalies in planetary imagery without needing labeled training examples, enabling smarter, onboard science discovery for lunar and Mars missions.
Abstract: Space missions generate massive volumes of high-resolution orbital and surface imagery that far exceed the capacity for manual inspection. Of particular scientific interest is the automated detection of rare phenomena, which may represent valuable discoveries or previously unanticipated surface processes. Traditional supervised learning approaches are fundamentally limited in this context by the scarcity of labeled rare events and by closed-world assumptions that preclude the discovery of truly novel observations. In this work, we investigate Visual Anomaly Detection (VAD) as a framework for automated discovery in planetary exploration. We present the first empirical evaluation of state-of-the-art feature-based VAD methods on real planetary imagery, encompassing both orbital lunar data and Mars rover surface imagery. To support this evaluation, we introduce two benchmarks: (i) a lunar dataset derived from Lunar Reconnaissance Orbiter Camera Narrow Angle imagery, comprising fresh and degraded craters as anomalies alongside normal terrain; and (ii) a Mars surface dataset designed to reflect the characteristics of rover-acquired imagery. We evaluate multiple VAD approaches with a focus on computationally efficient, edge-oriented solutions suitable for onboard deployment, applicable both to orbital platforms surveying the lunar surface and to surface rovers operating on Mars. Our results demonstrate that feature-based VAD methods can effectively identify rare planetary surface phenomena while remaining feasible for resource-constrained environments. By grounding anomaly detection in planetary science, this work establishes practical benchmarks and highlights the potential of open-world perception systems to support a range of mission-critical applications, including tactical planning, landing site selection, hazard detection, bandwidth-aware data prioritization, and the discovery of unanticipated geological processes.
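To illustrate the feature-based VAD paradigm the abstract refers to, the following is a minimal, hypothetical sketch (not the paper's actual method): a memory bank of feature vectors extracted from normal terrain patches, with the anomaly score of a test patch defined as its distance to the nearest normal feature. Real systems such as PatchCore use deep pretrained features and coreset subsampling; here random vectors stand in for extracted features.

```python
import numpy as np

def fit_memory_bank(normal_feats):
    """Store feature vectors of normal terrain patches as the memory bank."""
    return np.asarray(normal_feats, dtype=float)

def anomaly_score(bank, feat):
    """Score a test patch by its distance to the nearest normal feature:
    large distance -> the patch looks unlike any normal terrain seen so far."""
    dists = np.linalg.norm(bank - np.asarray(feat, dtype=float), axis=1)
    return float(dists.min())

# Illustrative stand-ins for extracted patch features (assumed, not real data):
rng = np.random.default_rng(0)
bank = fit_memory_bank(rng.normal(0.0, 1.0, size=(500, 64)))  # "normal" patches
normal_patch = rng.normal(0.0, 1.0, size=64)
anomalous_patch = rng.normal(4.0, 1.0, size=64)  # shifted distribution

print(anomaly_score(bank, normal_patch) < anomaly_score(bank, anomalous_patch))
```

Because scoring is a nearest-neighbor lookup over a fixed bank, this style of detector needs no anomaly labels and has a small, predictable compute footprint, which is what makes it attractive for onboard, edge-constrained deployment.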
Email Sharing: We authorize the sharing of all author emails with Program Chairs.
Data Release: We authorize the release of our submission and author names to the public in the event of acceptance.
Submission Number: 31