Limitations for Learning from Point Clouds

25 Sept 2019 (modified: 05 May 2023) · ICLR 2020 Conference Blind Submission
TL;DR: We prove new universal approximation theorems for PointNets and DeepSets and demonstrate new limitations.
Abstract: In this paper we prove new universal approximation theorems for deep learning on point clouds that do not assume fixed cardinality. We do this by first generalizing the classical universal approximation theorem to general compact Hausdorff spaces and then applying this result to the permutation-invariant architectures presented in PointNet (Qi et al., 2017) and Deep Sets (Zaheer et al., 2017). Moreover, although both architectures operate on the same domain, we show that the constant functions are the only functions they can both uniformly approximate. In particular, DeepSets architectures can uniformly approximate the center-of-mass function but not the diameter function, while for PointNet it is the other way around.
Keywords: universal approximation, point clouds, deep learning, Hausdorff metric, Wasserstein metric
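
For intuition on the two architectures contrasted in the abstract, here is a minimal sketch (not the authors' code) of the shared permutation-invariant template: a per-point feature map followed by symmetric pooling, max-pooling for PointNet and mean-pooling for the normalized DeepSets variant. The placeholder functions `phi` and `rho` stand in for learned networks and are hypothetical; the toy example only illustrates why mean-pooling naturally recovers the center of mass while max-pooling naturally recovers coordinate-wise extremes, which is the flavor of the diameter obstruction.

```python
import numpy as np

def pointnet_style(X, phi, rho):
    """PointNet-style network: apply phi to each point, then max-pool over points."""
    return rho(np.max(phi(X), axis=0))

def deepsets_style(X, phi, rho):
    """Normalized DeepSets-style network: apply phi to each point, then mean-pool."""
    return rho(np.mean(phi(X), axis=0))

# Toy point cloud of variable cardinality n (here n = 3) in R^2.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]])

# With phi and rho taken as identities (a hand-picked, hypothetical choice),
# mean-pooling returns the center of mass and max-pooling the coordinate-wise maxima.
center_of_mass = deepsets_style(X, phi=lambda x: x, rho=lambda z: z)
coord_max      = pointnet_style(X, phi=lambda x: x, rho=lambda z: z)
print(center_of_mass)  # [0.333..., 0.666...]
print(coord_max)       # [1., 2.]
```

Note that both sketches are invariant to permuting the rows of `X` and accept clouds of any cardinality, which is the setting of the paper's approximation theorems.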