Keywords: Dense Correspondence, Generative Model, Neural Radiance Field
Abstract: Neural radiance field (NeRF), a 3D shape representation, has shown promising results in reconstructing geometry and textures from images. However, unlike mesh- or signed distance function (SDF)-based representations, it remains an open problem to build correspondences across radiance fields, limiting their application in many downstream tasks.
Prior works assume the availability of either correspondence annotations or 3D shapes as supervision signals; these assumptions do not hold for NeRF.
This paper shows that by leveraging rich structural priors encapsulated in a pretrained NeRF generative adversarial network (GAN), we can learn correspondence in a self-supervised manner without using any correspondence or 3D supervision.
To exploit these priors, we devise a novel Bijective Deformation Field (BDF) that establishes a bijective shape deformation between 3D radiance fields.
Our experiments demonstrate that the GAN-derived priors are discriminative enough to guide the learning of accurate, smooth, and robust 3D dense correspondence.
We also show that BDF can produce high-quality dense correspondences across different shapes belonging to the same object category.
We further demonstrate how the accurate correspondences facilitate downstream applications such as texture transfer, segmentation transfer, and deformation transfer. Code and models will be released.
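To make the bijectivity idea concrete, below is a minimal sketch (not the paper's implementation, whose architecture and losses are not specified in this abstract) of how a bijective deformation field over 3D points could be modeled: a forward and an inverse deformation MLP coupled by a cycle-consistency loss so the two mappings are approximately mutual inverses. All module names and hyperparameters here are illustrative assumptions.

```python
# Hypothetical sketch of a bijective deformation field: forward MLP f maps
# points of shape A toward shape B, inverse MLP g maps them back, and a
# cycle-consistency loss encourages g(f(x)) ~ x and f(g(y)) ~ y.
import torch
import torch.nn as nn

class DeformMLP(nn.Module):
    """Small MLP predicting a residual 3D offset for each query point."""
    def __init__(self, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Residual offset keeps the deformation close to identity at init.
        return x + self.net(x)

forward_field = DeformMLP()   # shape A -> shape B
inverse_field = DeformMLP()   # shape B -> shape A

points = torch.rand(1024, 3) * 2 - 1          # query points in [-1, 1]^3
cycle_ab = inverse_field(forward_field(points))
cycle_ba = forward_field(inverse_field(points))

# Cycle-consistency loss pushing the two fields toward being mutual inverses,
# i.e. an (approximately) bijective point-wise correspondence.
cycle_loss = (cycle_ab - points).abs().mean() + (cycle_ba - points).abs().mean()
cycle_loss.backward()
```

In practice, such a deformation field would be trained jointly with losses derived from the pretrained NeRF GAN priors described above; the cycle term shown here only illustrates how bijectivity can be encouraged.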