Cross-View Yaw Estimation in Location Uncertainty with Line-Aligning Yaw Scoring

03 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: cross-view localization, rotation estimation, aerial view
Abstract: Accurate rotation estimation is crucial in autonomous navigation and AR/MR (Augmented/Mixed Reality) applications, where small angular errors can lead to significant misalignment or navigation failures. Among the three rotation angles (pitch, roll, and yaw), yaw is the most challenging to estimate because it lacks direct geometric cues such as gravity-aligned structures. Given a BEV (Bird's Eye View) image, yaw estimation is typically treated as inseparable from cross-view localization: it is solved jointly with location and inevitably requires hypothesizing the height and ground distance of each pixel. We introduce LAYS, a line-aligning yaw scoring approach that enables precise yaw estimation. We propose a 3D voting-based search that effectively isolates the 1-DoF yaw component, enabling robust estimation without relying on ground-truth position or assuming a ground height. In our method, each BEV pixel is matched to a ground-view column based on feature similarity; using the relative yaw of the matched column, the match score is cast into a yaw bin for each 2D pose pixel. To address location uncertainty, our method identifies line correspondences between the ground and BEV views and formulates the problem so that a single such correspondence is sufficient to determine yaw. LAYS achieves state-of-the-art sub-degree yaw accuracy, improving it from 6.55% to 34.81% on the Mapillary Geo-Localization dataset, from 41.36% to 67.05% on the Ford Multi-AV dataset, and from 12.39% to 23.67% on the VIGOR dataset, setting a new benchmark for precise localization in real-world scenarios.
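The voting-based yaw search summarized in the abstract can be illustrated with a minimal sketch: each BEV pixel is matched to the most similar ground-view column, and the implied yaw (BEV bearing minus the matched column's azimuth) casts a score into a yaw histogram. All dimensions, the random features, the single pose hypothesis at the BEV center, and cosine-similarity matching are illustrative assumptions, not the paper's actual implementation (which searches over 2D pose pixels).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dimensions (not from the paper)
H_BEV = W_BEV = 16        # BEV feature map size
N_COLS = 64               # ground-view columns, one per azimuth
N_YAW_BINS = 72           # 5-degree yaw bins
C = 8                     # feature dimension

bev_feat = rng.normal(size=(H_BEV, W_BEV, C))
col_feat = rng.normal(size=(N_COLS, C))

# Azimuth of each ground-view column relative to the camera's forward axis
col_azim = np.linspace(0.0, 2 * np.pi, N_COLS, endpoint=False)

votes = np.zeros(N_YAW_BINS)
cx, cy = W_BEV / 2.0, H_BEV / 2.0  # single pose hypothesis: BEV center
for i in range(H_BEV):
    for j in range(W_BEV):
        # Best-matching ground column for this BEV pixel (cosine similarity)
        f = bev_feat[i, j]
        sims = col_feat @ f / (
            np.linalg.norm(col_feat, axis=1) * np.linalg.norm(f) + 1e-8
        )
        k = int(np.argmax(sims))
        # Bearing of the BEV pixel from the hypothesized camera position
        bearing = np.arctan2(i - cy, j - cx)
        # Yaw hypothesis implied by this pixel-column match
        yaw = (bearing - col_azim[k]) % (2 * np.pi)
        b = int(yaw / (2 * np.pi) * N_YAW_BINS) % N_YAW_BINS
        votes[b] += max(sims[k], 0.0)  # score the yaw bin

# Estimated yaw = center of the winning bin, in degrees
yaw_est = (np.argmax(votes) + 0.5) * 360.0 / N_YAW_BINS
print(f"estimated yaw: {yaw_est:.1f} deg")
```

With real (rather than random) features, the histogram would peak sharply at the true relative yaw; the point of the sketch is only that each match votes for a 1-DoF yaw value, so location and yaw need not be resolved jointly.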
Primary Area: applications to computer vision, audio, language, and other modalities
Submission Number: 1205