Deep Regression on Manifolds: A 3D Rotation Case Study

3DV 2021
Abstract: Many machine learning problems involve regressing variables on a non-Euclidean manifold, e.g. a discrete probability distribution or the 6D pose of an object. One way to tackle these problems through gradient-based learning is to use a differentiable function that maps arbitrary inputs of a Euclidean space onto the manifold. In this paper, we establish a set of desirable properties for such a mapping, and in particular highlight the importance of the connectivity and convexity of its pre-images. We illustrate these properties with a case study on 3D rotations. Through theoretical considerations and methodological experiments on a variety of tasks, we review various differentiable mappings onto the 3D rotation space, and conjecture about the importance of their local linearity. We show that a mapping based on Procrustes orthonormalization generally performs best among the mappings considered, but that a rotation vector representation might also be suitable when restricted to small angles.
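
For concreteness, below is a minimal sketch of the kind of Procrustes orthonormalization the abstract refers to: an unconstrained 3x3 matrix (e.g., a 9D network output reshaped) is projected onto the closest rotation matrix in Frobenius norm via an SVD, with the sign of the last singular direction flipped so the determinant is +1. The function name and the NumPy formulation are illustrative assumptions, not the paper's implementation; in a learning setting the same operation would be expressed with a differentiable SVD inside the training framework.

```python
import numpy as np

def procrustes_to_rotation(m: np.ndarray) -> np.ndarray:
    """Map an arbitrary 3x3 matrix onto the closest rotation in SO(3).

    Uses the special orthogonal Procrustes solution: M = U S V^T gives
    R = U diag(1, 1, det(U V^T)) V^T, the nearest matrix with R R^T = I
    and det(R) = +1 in the Frobenius-norm sense.
    """
    u, _, vt = np.linalg.svd(m)
    # Correct the sign so the result is a proper rotation (det = +1),
    # not a reflection.
    d = np.sign(np.linalg.det(u @ vt))
    return u @ np.diag([1.0, 1.0, d]) @ vt

# Example: project a random 9D "network output" (reshaped to 3x3) onto SO(3).
raw = np.random.randn(3, 3)
R = procrustes_to_rotation(raw)
assert np.allclose(R @ R.T, np.eye(3), atol=1e-6)
assert np.isclose(np.linalg.det(R), 1.0, atol=1e-6)
```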