Learning to Transfer Heterogeneous Translucent Materials from a 2D Image to 3D Models

Published: 20 Jul 2024, Last Modified: 21 Jul 2024, MM 2024 Poster, CC BY 4.0
Abstract: Great progress has been made in rendering translucent materials in recent years, but automatically estimating parameters for heterogeneous materials such as jade and human skin remains a challenging task, often requiring specialized and expensive physical measurement devices. In this paper, we present a novel approach for estimating and transferring the parameters of heterogeneous translucent materials from a single 2D image to 3D models. Our method consists of four key steps: (1) an efficient viewpoint selection algorithm that minimizes redundancy while ensuring comprehensive coverage of the model; (2) initializing a homogeneous translucent material and rendering initial images to build a translucent dataset; (3) editing the rendered translucent images to update the translucent dataset; and (4) optimizing material parameters to match the edited translucent results using inverse rendering. Our approach offers a practical and accessible solution that overcomes the limitations of existing methods, which often rely on complex and costly specialized devices. We demonstrate the effectiveness of our method through extensive experiments, showing that it transfers and edits high-quality heterogeneous translucent materials on 3D models and surpasses the results achieved by previous techniques in 3D scene editing.
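The abstract does not specify how step (1) selects viewpoints, so the following is a minimal, hypothetical sketch of one common approach: greedy farthest-point sampling over candidate cameras placed on a sphere around the model, which spreads views over the object and reduces redundancy. All function names and parameters below are illustrative assumptions, not the paper's actual algorithm.

```python
# Hypothetical illustration of a redundancy-minimizing viewpoint selection step.
# Not the paper's method: a generic farthest-point-sampling sketch over candidate
# cameras distributed on a Fibonacci sphere around the 3D model.
import numpy as np


def candidate_cameras(n: int, radius: float = 2.0) -> np.ndarray:
    """Place n candidate camera positions on a Fibonacci sphere around the model."""
    i = np.arange(n)
    phi = np.pi * (3.0 - np.sqrt(5.0)) * i        # golden-angle azimuth increment
    z = 1.0 - 2.0 * (i + 0.5) / n                 # evenly spaced heights in [-1, 1]
    r = np.sqrt(1.0 - z * z)
    return radius * np.stack([r * np.cos(phi), r * np.sin(phi), z], axis=1)


def select_viewpoints(candidates: np.ndarray, k: int) -> np.ndarray:
    """Greedy farthest-point sampling: pick k cameras that are maximally spread out."""
    chosen = [0]                                  # start from an arbitrary candidate
    dists = np.linalg.norm(candidates - candidates[0], axis=1)
    for _ in range(k - 1):
        nxt = int(np.argmax(dists))               # farthest from all cameras chosen so far
        chosen.append(nxt)
        dists = np.minimum(dists, np.linalg.norm(candidates - candidates[nxt], axis=1))
    return candidates[chosen]


if __name__ == "__main__":
    cams = select_viewpoints(candidate_cameras(200), k=12)
    print(cams.shape)  # (12, 3) camera positions covering the model with low redundancy
```

In this sketch the selected camera positions would feed the rendering of the initial homogeneous translucent images in step (2); the paper's own selection criterion (e.g., surface coverage rather than angular spread) may differ.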
Primary Subject Area: [Generation] Generative Multimedia
Secondary Subject Area: [Experience] Art and Culture
Relevance To Conference: This paper presents a novel approach for transferring heterogeneous translucent material parameters from 2D images to 3D models. This work is highly relevant to multimedia, as it addresses key challenges in generative multimedia and artistic content creation. The proposed method enables the realistic reproduction of complex translucent materials, empowering artists and designers to incorporate visually striking elements into their digital creations. The paper's contributions to appearance modeling and material editing are expected to have a significant impact on the development of next-generation multimedia and artistic applications.
Supplementary Material: zip
Submission Number: 464