Pro3D-Editor: A Progressive Framework for Consistent and Precise 3D Editing

Published: 18 Sept 2025 · Last Modified: 29 Oct 2025 · NeurIPS 2025 poster · CC BY 4.0
Keywords: 3D editing, 3D Gaussian Splatting, multi-view diffusion model
TL;DR: We propose a progressive 3D editing framework that achieves consistent and precise 3D editing.
Abstract: Text-guided 3D editing aims to locally modify 3D objects based on editing prompts, and has significant potential for applications in the 3D gaming and film domains. Existing methods typically follow a view-agnostic paradigm: they edit 2D view images indiscriminately and project them back into 3D space. However, this view-agnostic paradigm neglects view consistency and view-specific characteristics, resulting in spatial inconsistencies and imprecise control over the edited regions. In this study, we argue that a progressive view-oriented paradigm can effectively address these issues by projecting the editing information from an editing-sensitive view onto the remaining editing-insensitive views. Based on this paradigm, we design Pro3D-Editor, a new framework. Extensive experiments demonstrate that our method outperforms existing approaches in terms of editing accuracy and spatial consistency.
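The following is a minimal, hypothetical sketch of the progressive view-oriented paradigm described in the abstract: edit one editing-sensitive anchor view, propagate that edit to the remaining views, then lift the edited views back into the 3D Gaussian Splatting scene. All function names and signatures here (edit_anchor_view, propagate_edit, update_gaussians) are illustrative placeholders and are not Pro3D-Editor's actual API.

```python
# Illustrative sketch only; not the authors' implementation.
from typing import Any, Dict, List


def edit_anchor_view(view: Any, prompt: str) -> Any:
    """Apply the text-guided 2D edit to the single editing-sensitive view (placeholder)."""
    ...


def propagate_edit(edited_anchor: Any, target_view: Any) -> Any:
    """Transfer the anchor's edit to an editing-insensitive view,
    e.g. via a multi-view diffusion model conditioned on the anchor (placeholder)."""
    ...


def update_gaussians(scene: Any, edited_views: List[Any]) -> Any:
    """Re-optimize the 3D Gaussian Splatting scene from the edited views (placeholder)."""
    ...


def progressive_edit(scene: Any, views: List[Any], anchor_idx: int, prompt: str) -> Any:
    # 1. Edit only the view most sensitive to the editing prompt.
    edited: Dict[int, Any] = {anchor_idx: edit_anchor_view(views[anchor_idx], prompt)}

    # 2. Progressively project that edit onto the remaining views,
    #    rather than editing every view independently (view-agnostic).
    for i, view in enumerate(views):
        if i != anchor_idx:
            edited[i] = propagate_edit(edited[anchor_idx], view)

    # 3. Lift the now-consistent 2D edits back into 3D.
    return update_gaussians(scene, [edited[i] for i in range(len(views))])
```

The intent of the sketch is to contrast the two paradigms: the view-agnostic approach edits each view in isolation, whereas the progressive approach anchors all other views to a single edited view before updating the 3D representation.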
Supplementary Material: zip
Primary Area: Applications (e.g., vision, language, speech and audio, Creative AI)
Submission Number: 1340