Directional Distance Field for Modeling the Difference between 3D Point Clouds

19 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: representation learning for computer vision, audio, language, and other modalities
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: 3D point cloud
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: An efficient yet effective distance metric that computes the difference between the underlying 3D surfaces of point clouds, empowering point cloud modeling.
Abstract: Quantifying the dissimilarity between two unstructured 3D point clouds is challenging yet essential, and existing metrics often rely on measuring the distance between corresponding points, which can be either inefficient or ineffective. In this paper, we propose a novel distance metric called directional distance field (DDF), which computes the difference between the underlying 3D surfaces, calibrated and induced by a set of reference points. Each reference point is associated with the two given point clouds by computing its directional distances to them; the difference between the directional distances of the same reference point then characterizes the geometric difference of a typical local region between the two point clouds. Finally, DDF is obtained by averaging the directional distance differences over all reference points. We evaluate DDF on various optimization- and unsupervised learning-based tasks, including shape reconstruction, rigid registration, scene flow estimation, and feature representation. Extensive experiments show that, compared with existing metrics, DDF achieves significantly higher accuracy on all tasks in a memory- and computation-efficient manner. As a generic metric, DDF can unleash the potential of optimization- and learning-based frameworks for 3D point cloud processing and analysis. We include the source code in the supplementary material.
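The abstract does not spell out the formal definition of "directional distance," so the sketch below assumes one plausible reading: the directional distance of a reference point to a point cloud is the displacement vector to its nearest neighbor in that cloud, and DDF averages the norms of the per-reference-point differences between the two clouds. The function names (`directional_distances`, `ddf`) and this nearest-neighbor formulation are illustrative assumptions, not the paper's exact method.

```python
import numpy as np
from scipy.spatial import cKDTree

def directional_distances(refs, cloud):
    """Displacement vectors from each reference point to its nearest
    neighbor in `cloud` (one plausible reading of 'directional distance')."""
    tree = cKDTree(cloud)
    _, idx = tree.query(refs)      # nearest neighbor of each reference point
    return cloud[idx] - refs       # (R, 3) displacement vectors

def ddf(cloud_a, cloud_b, refs):
    """Average the per-reference-point differences in directional distance,
    as described in the abstract (assumed formulation)."""
    da = directional_distances(refs, cloud_a)
    db = directional_distances(refs, cloud_b)
    return np.mean(np.linalg.norm(da - db, axis=1))

# Toy usage: two noisy samplings of the same planar patch should
# yield a small DDF value.
rng = np.random.default_rng(0)
a = rng.uniform(0.0, 1.0, size=(500, 3)) * [1.0, 1.0, 0.0]
b = a + rng.normal(scale=0.01, size=a.shape)
refs = rng.uniform(0.0, 1.0, size=(100, 3))  # reference points near the surface
print(ddf(a, b, refs))
```

Note that, unlike Chamfer-style metrics, this formulation never matches points of the two clouds against each other; both clouds are only queried from the shared reference set, which is what makes the per-region comparison possible.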
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
Supplementary Material: zip
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 1576