PROSE: Point Rendering of Sparse-Controlled Edits to Static Scenes

ICLR 2026 Conference Submission 14050 Authors

18 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Static 3D Scene Editing, Point-based Novel View Synthesis, Proximity Attention Point Rendering
Abstract: Advances in neural rendering have enabled high-fidelity multi-view reconstruction and rendering of 3D scenes. However, current approaches to free-form shape editing either produce inaccurate and imprecise edits because they require fitting proxy geometry, or yield surface discontinuities under large deformations. In this work, we present a novel method built on a point-based neural renderer, PAPR, that addresses both issues: no proxy geometry needs to be fitted, and surface continuity is preserved after editing. Specifically, we design a novel way to guide shape editing with a set of sparse control points. We demonstrate that our method can effectively edit object shapes while preserving surface continuity and avoiding artifacts. Through extensive experiments on both synthetic and real-world datasets covering various types of non-rigid shape edits, we show that our method consistently outperforms existing approaches.
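
The abstract does not specify how the sparse control points propagate an edit to the rest of the point cloud, so the following is only a minimal sketch of one plausible mechanism: smoothly interpolating control-point displacements across the scene points with a Gaussian kernel. The function name `deform_points` and the bandwidth parameter `sigma` are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def deform_points(points, ctrl_src, ctrl_dst, sigma=0.1):
    """Deform a point cloud by blending sparse control-point displacements.

    points:   (N, 3) scene points (e.g., positions of a point-based renderer)
    ctrl_src: (K, 3) control points before the edit
    ctrl_dst: (K, 3) control points after the edit
    sigma:    kernel width governing each control point's region of influence
    (Hypothetical sketch; the paper's actual propagation rule is not given
    in the abstract.)
    """
    disp = ctrl_dst - ctrl_src  # (K, 3) displacement of each control point
    # Squared distance from every scene point to every control point: (N, K)
    d2 = ((points[:, None, :] - ctrl_src[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))        # Gaussian influence weights
    w = w / (w.sum(-1, keepdims=True) + 1e-8)   # normalize per scene point
    return points + w @ disp                    # smoothly blended deformation
```

Because every scene point receives a convex combination of the control displacements, nearby points move together and the deformed surface stays continuous, which is consistent with the continuity-preservation property the abstract claims.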
Supplementary Material: zip
Primary Area: applications to computer vision, audio, language, and other modalities
Submission Number: 14050