Perceptual-GS: Scene-adaptive Perceptual Densification for Gaussian Splatting

Published: 01 May 2025 · Last Modified: 18 Jun 2025 · ICML 2025 poster · CC BY 4.0
Abstract: 3D Gaussian Splatting (3DGS) has emerged as a powerful technique for novel view synthesis. However, existing methods struggle to adaptively optimize the distribution of Gaussian primitives based on scene characteristics, making it challenging to balance reconstruction quality and efficiency. Inspired by human perception, we propose scene-adaptive perceptual densification for Gaussian Splatting (Perceptual-GS), a novel framework that integrates perceptual sensitivity into the 3DGS training process to address this challenge. We first introduce a perception-aware representation that models human visual sensitivity while constraining the number of Gaussian primitives. Building on this foundation, we develop a perceptual sensitivity-adaptive distribution to allocate finer Gaussian granularity to visually critical regions, enhancing reconstruction quality and robustness. Extensive evaluations on multiple datasets, including BungeeNeRF for large-scale scenes, demonstrate that Perceptual-GS achieves state-of-the-art performance in reconstruction quality, efficiency, and robustness. The code is publicly available at: https://github.com/eezkni/Perceptual-GS
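To make the idea concrete, below is a minimal sketch, not the authors' implementation, of how a perceptual-sensitivity signal could steer densification in a 3DGS pipeline. It assumes a simple Sobel gradient-magnitude proxy for visual sensitivity and a hypothetical per-Gaussian threshold rule with a `strength` parameter and the common 3DGS base gradient threshold of 2e-4; the actual sensitivity model and densification rule in Perceptual-GS are defined in the paper and repository.

```python
# Minimal sketch (not the authors' code): a gradient-based proxy for perceptual
# sensitivity, used to scale a per-Gaussian densification threshold so that
# visually critical (textured) regions receive finer Gaussian granularity.
import torch
import torch.nn.functional as F


def sensitivity_map(image: torch.Tensor) -> torch.Tensor:
    """Per-pixel sensitivity proxy in [0, 1] from local gradient magnitude.

    image: (3, H, W) tensor with values in [0, 1].
    """
    gray = image.mean(dim=0, keepdim=True).unsqueeze(0)  # (1, 1, H, W)
    kx = torch.tensor([[-1., 0., 1.],
                       [-2., 0., 2.],
                       [-1., 0., 1.]]).view(1, 1, 3, 3)
    ky = kx.transpose(2, 3)
    gx = F.conv2d(gray, kx, padding=1)
    gy = F.conv2d(gray, ky, padding=1)
    mag = torch.sqrt(gx ** 2 + gy ** 2)
    return (mag / (mag.max() + 1e-8)).squeeze()  # (H, W), higher = more textured


def adaptive_grad_threshold(base_threshold: float,
                            gaussian_sensitivity: torch.Tensor,
                            strength: float = 0.5) -> torch.Tensor:
    """Lower the densification threshold where sensitivity is high (hypothetical rule)."""
    return base_threshold * (1.0 - strength * gaussian_sensitivity)


if __name__ == "__main__":
    img = torch.rand(3, 256, 256)        # stand-in for a training view
    sens = sensitivity_map(img)          # (256, 256) sensitivity proxy
    # Hypothetical per-Gaussian sensitivity, e.g. aggregated from the pixels a
    # Gaussian covers during rasterization (random values here for illustration).
    per_gaussian_sens = torch.rand(10_000)
    thresholds = adaptive_grad_threshold(2e-4, per_gaussian_sens)
    print(sens.shape, thresholds.min().item(), thresholds.max().item())
```

In this sketch, Gaussians covering high-sensitivity pixels get a lower gradient threshold and therefore split or clone earlier, while Gaussians in smooth regions are densified more conservatively, which mirrors the quality-efficiency trade-off described in the abstract.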
Lay Summary: Efficient and high-quality 3D scene reconstruction remains a significant challenge in computer vision. We investigate whether theories related to human perception can guide the trade-off between quality and efficiency in 3D scene reconstruction. Building on the theory that the Human Visual System is highly sensitive to complex textures but less so to smooth areas, we introduce Perceptual-GS. This approach ensures that the reconstructed 3D scenes align with characteristics of human perception. Our experiments show that Perceptual-GS significantly outperforms existing methods in reconstruction quality, particularly in perceptual quality, while also achieving superior efficiency in terms of both storage and computation. Moreover, when combined with other techniques, our method demonstrates enhanced effectiveness, highlighting its potential for a broader range of 3D reconstruction applications.
Application-Driven Machine Learning: This submission is on Application-Driven Machine Learning.
Link To Code: https://github.com/eezkni/Perceptual-GS
Primary Area: Applications->Computer Vision
Keywords: Novel View Synthesis, 3D Gaussian Splatting, Adaptive Density Control, Human Visual System
Submission Number: 3146