Lunar Surface Reconstruction via Neural Rendering of KPLO LUTI Imagery

Published: 10 Mar 2025 · Last Modified: 03 Aug 2025 · The 56th Lunar and Planetary Science Conference (LPSC) · CC BY 4.0
Abstract: High-precision digital elevation models (DEMs) of the Moon are essential for mission planning and scientific analysis. However, traditional photogrammetric methods often struggle to maintain spatial resolution and consistency when processing pushbroom camera imagery, which is acquired as long, continuous sequences of image lines. Furthermore, because most lunar images are acquired in a nadir (straight-down) orientation to support cartographic and geological research, limited parallax (minimal or ambiguous differences in viewing angle between images) makes it difficult to derive accurate 3D structure from standard stereo matching. To address these challenges, we propose a custom neural rendering framework based on a simplified Neural Radiance Field (NeRF). While NeRF typically requires multiple consistent images from different viewpoints for high-fidelity 3D rendering, our approach simulates multi-view geometry by treating each pushbroom image line as a distinct viewpoint and modeling the spacecraft's orbit as a continuous path over the planetary surface, all from a single pass. We also show that our framework remains robust even when fewer pushbroom lines are available, making it particularly promising for large-scale reconstructions.
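To make the per-line viewpoint idea concrete, the following is a minimal sketch, not the authors' implementation, of how a nadir pushbroom pass could feed a simplified NeRF: each scan line receives its own camera center sampled along a smooth orbital path, one ray is cast per detector pixel, and a small MLP is volume-rendered along those rays. All names (`orbit_position`, `line_rays`, `SimpleNeRF`, `render_line`), the straight constant-altitude track, and the grayscale radiance output are illustrative assumptions; a real pipeline would use the KPLO LUTI camera model and spacecraft ephemeris.

```python
import torch
import torch.nn as nn

# Hypothetical illustration only: each pushbroom scan line is treated as its
# own "viewpoint" whose camera center lies on a continuous orbital path.

def orbit_position(t: torch.Tensor, altitude: float = 100.0) -> torch.Tensor:
    """Camera center at normalized along-track time t in [0, 1].

    Modeled as a straight, constant-altitude track over a locally flat surface;
    an actual pipeline would query the spacecraft ephemeris instead.
    """
    x = t * 50.0                        # along-track distance (km)
    y = torch.zeros_like(t)             # no cross-track motion for a single pass
    z = torch.full_like(t, altitude)    # orbital altitude (km)
    return torch.stack([x, y, z], dim=-1)

def line_rays(t: torch.Tensor, n_pixels: int = 256, swath: float = 10.0):
    """Build one ray per detector pixel for the scan line acquired at time t."""
    origin = orbit_position(t)                                # (3,)
    cross = torch.linspace(-swath / 2, swath / 2, n_pixels)   # cross-track offsets (km)
    # Nadir-looking geometry: rays point down toward the surface plane z = 0.
    targets = torch.stack([origin[0].expand(n_pixels),
                           cross,
                           torch.zeros(n_pixels)], dim=-1)
    dirs = targets - origin
    dirs = dirs / dirs.norm(dim=-1, keepdim=True)
    origins = origin.expand(n_pixels, 3)
    return origins, dirs

class SimpleNeRF(nn.Module):
    """Tiny MLP mapping a 3-D point to (density, grayscale radiance)."""
    def __init__(self, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 2),
        )

    def forward(self, x):
        out = self.net(x)
        sigma = torch.relu(out[..., :1])         # non-negative density
        radiance = torch.sigmoid(out[..., 1:])   # reflectance proxy in [0, 1]
        return sigma, radiance

def render_line(model, origins, dirs, n_samples=32, near=90.0, far=110.0):
    """Volume-render one scan line by sampling each ray between near and far."""
    ts = torch.linspace(near, far, n_samples)                              # (S,)
    pts = origins[:, None, :] + dirs[:, None, :] * ts[None, :, None]       # (P, S, 3)
    sigma, rad = model(pts)
    delta = (far - near) / n_samples
    alpha = 1.0 - torch.exp(-sigma.squeeze(-1) * delta)                    # (P, S)
    trans = torch.cumprod(torch.cat([torch.ones_like(alpha[:, :1]),
                                     1.0 - alpha[:, :-1]], dim=1), dim=1)
    weights = alpha * trans                                                # (P, S)
    return (weights[..., None] * rad).sum(dim=1)                           # (P, 1)

# Usage: render the scan line acquired at the middle of the pass.
model = SimpleNeRF()
o, d = line_rays(torch.tensor(0.5))
pixels = render_line(model, o, d)
print(pixels.shape)  # torch.Size([256, 1])
```

In a training loop, rendered pixel values for many scan lines would be compared against the observed pushbroom image, and a surface height map could then be extracted from the learned density field; those steps are omitted here for brevity.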