Style2Shape: Image Style Guided 3D Shape Material Generation

ICLR 2026 Conference Submission 25608 Authors

20 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Material Generation; Differentiable Rendering; Procedural Materials; Appearance Transfer; Physically-Based Rendering
Abstract: This paper presents Style2Shape, a novel framework for generating physically-based rendering (PBR) materials for 3D models from a single reference image. Unlike existing methods limited by the diversity of procedural material libraries or producing non-editable representations, our approach combines procedural materials with generated textures via differentiable rendering. Our key insight is that procedural parameters ensure reflectance correctness while generated textures capture arbitrary appearances; their learnable combination achieves both physical plausibility and visual fidelity. The framework operates in three stages: (1) structure-guided appearance transfer that synthesizes geometrically-aligned supervision, (2) hybrid PBR material initialization that retrieves procedural materials based on physical properties and generates complementary textures for appearance details, and (3) physics-based optimization jointly refining all components through differentiable rendering. Extensive experiments demonstrate that our approach generates high-fidelity results, producing editable PBR materials that faithfully reproduce reference appearances while maintaining physical plausibility. The generated assets are structured to be compatible with standard 3D rendering workflows.
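To give a rough sense of the third stage, the sketch below shows a toy differentiable-rendering loop that jointly optimizes a procedural parameter (roughness) and a learnable blend weight between a retrieved procedural base color and a generated texture against a reference image. This is a minimal illustration under assumed names and a stand-in shading model, not the authors' pipeline or renderer.

```python
# Minimal sketch (assumptions throughout): jointly optimize a procedural
# parameter and a procedural/generated texture blend weight through a toy
# differentiable shading step supervised by a reference image.
import torch

H, W = 64, 64
target = torch.rand(1, 3, H, W)                 # stand-in for the geometrically-aligned supervision image

proc_basecolor = torch.tensor([0.6, 0.4, 0.3])  # assumed base color of the retrieved procedural material
gen_texture = torch.rand(1, 3, H, W)            # assumed generated texture capturing appearance details

# Learnable components: procedural roughness and the blend weight
# mixing procedural base color with the generated texture.
roughness_logit = torch.zeros(1, requires_grad=True)
blend_logit = torch.zeros(1, requires_grad=True)

def shade(basecolor, roughness):
    """Toy differentiable shading: diffuse term plus a crude roughness-dependent specular proxy."""
    light = torch.tensor([0.0, 0.0, 1.0])        # fixed frontal light; surface normals assumed +Z
    n_dot_l = light[2].clamp(min=0.0)
    spec = (1.0 - roughness) * 0.2
    return basecolor * n_dot_l + spec

opt = torch.optim.Adam([roughness_logit, blend_logit], lr=0.05)
for step in range(200):
    opt.zero_grad()
    alpha = torch.sigmoid(blend_logit)           # learnable procedural/generated mix in [0, 1]
    roughness = torch.sigmoid(roughness_logit)   # keep roughness in [0, 1]
    basecolor = alpha * proc_basecolor.view(1, 3, 1, 1) + (1 - alpha) * gen_texture
    rendering = shade(basecolor, roughness)
    loss = torch.nn.functional.mse_loss(rendering, target)
    loss.backward()
    opt.step()

print(f"final loss {loss.item():.4f}, blend alpha {torch.sigmoid(blend_logit).item():.2f}")
```

In the paper's setting, the same idea would apply per texel and per PBR channel (base color, roughness, metallic, normal) with a physically-based differentiable renderer in place of the toy shading function.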
Supplementary Material: zip
Primary Area: applications to computer vision, audio, language, and other modalities
Submission Number: 25608