Keywords: geometry, topology, signed distance functions, Riemannian optimization
Abstract: Learning signed distance functions (SDFs) on curved spaces is common in geometry processing and non-Euclidean machine learning, yet most "pull/projection" updates are ambient and ignore curvature. We study an intrinsic pull operator that advances along geodesics with an explicit normalization to tame gradient-magnitude errors. Our contributions are threefold: (i) a curvature-explicit contraction guarantee in a tubular neighborhood around the target surface, including a necessary-and-sufficient step-size window and a closed-form optimal step; (ii) a practical path-safety rule and an off-band Lyapunov "funneling" effect that make the update usable from realistic initializations; and (iii) a discretization-aware refinement that absorbs mesh, distance, transport, and retraction errors into measurable constants. The result is a small, verifiable recipe: an intrinsic step, and theory that predicts what one should observe in practice.
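To make the abstract's central object concrete, here is a minimal sketch of one intrinsic pull step on the unit sphere, where the exponential map has a closed form. This is an illustration under assumed details, not the paper's exact operator: the update form exp_x(-tau * f(x) * grad f(x) / ||grad f(x)||), the toy SDF (signed geodesic distance to the equator), and the helper names are all hypothetical.

```python
import numpy as np

def exp_map_sphere(x, v):
    """Exponential map on the unit sphere: follow the geodesic
    starting at x with initial tangent velocity v."""
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return x
    return np.cos(nv) * x + np.sin(nv) * (v / nv)

def intrinsic_pull(x, f, grad_f, tau=1.0):
    """One intrinsic pull step: move a geodesic distance tau*|f(x)|
    along the *normalized* tangential gradient, toward the zero set.
    Normalizing the gradient is the "explicit normalization" that
    tames gradient-magnitude errors in a learned SDF."""
    g = grad_f(x)
    g_tan = g - np.dot(g, x) * x          # project ambient gradient onto tangent space at x
    norm = np.linalg.norm(g_tan)
    if norm < 1e-12:
        return x
    step = -tau * f(x) * g_tan / norm     # direction scaled by signed distance only
    return exp_map_sphere(x, step)

# Toy example (hypothetical): SDF of the equator {z = 0} on the sphere
# is the signed latitude, f(x) = arcsin(z).
f = lambda x: np.arcsin(np.clip(x[2], -1.0, 1.0))
grad_f = lambda x: np.array([0.0, 0.0, 1.0])  # ambient gradient up to positive scale

x0 = np.array([0.0, np.cos(0.5), np.sin(0.5)])  # 0.5 radians north of the equator
x1 = intrinsic_pull(x0, f, grad_f, tau=1.0)
print(x1)  # lands on the equator: z-component is (numerically) zero
```

With an exact SDF and the full step tau = 1, a single pull reaches the zero set; the paper's step-size window and contraction constants describe how this degrades under curvature and approximation error.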
Submission Number: 166