Keywords: 2D Gaussian Splatting, Scene Representation, Variable Illumination
Abstract: Modeling scenes under unknown, varying single-point illumination is crucial for applications such as interactive relighting, augmented reality, and robotics. However, existing dynamic novel-view synthesis methods struggle with changing lighting, often misinterpreting illumination-induced appearance variations (e.g., cast and self-shadows), which complicates optimization and degrades reconstruction quality. In this paper, we present a point-based framework that explicitly models dynamic scenes under rapidly changing single-point light sources, enabling accurate reconstruction of illumination effects and realistic novel-view rendering under arbitrary new lighting conditions. Our approach builds on 2D Gaussian splatting, augmenting each Gaussian splat with learned BRDF parameters and leveraging physics-based differentiable rendering to decouple reflectance from illumination at each point. We further integrate a shadow-mapping module into the rendering pipeline to capture accurate shadows at novel light positions. Experiments on dynamic scenes demonstrate that our method faithfully reproduces complex lighting and shadow variations in novel views.
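To make the abstract's per-splat shading concrete, below is a minimal NumPy sketch of how a 2D Gaussian splat carrying learned reflectance parameters might be shaded under a single point light with a shadow-visibility term. This is an illustration under assumptions, not the paper's actual pipeline: the class and parameter names (`SplatBRDF`, `albedo`, `roughness`, `specular`), the Blinn-Phong specular term standing in for the unspecified BRDF model, and the scalar `visibility` standing in for the shadow-mapping module are all hypothetical.

```python
import numpy as np

class SplatBRDF:
    """Hypothetical per-splat state: a 2D Gaussian's geometry plus
    learned BRDF parameters, as described in the abstract."""
    def __init__(self, position, normal, albedo, roughness, specular):
        self.position = np.asarray(position, dtype=np.float64)  # splat center (3,)
        self.normal = np.asarray(normal, dtype=np.float64)      # normal of the 2D disk (3,)
        self.albedo = np.asarray(albedo, dtype=np.float64)      # diffuse RGB reflectance (3,)
        self.roughness = float(roughness)                       # scalar in (0, 1]
        self.specular = float(specular)                         # scalar specular weight

def shade_point_light(splat, light_pos, light_rgb, cam_pos, visibility=1.0):
    """Shade one splat under a single point light.

    `visibility` stands in for the shadow-mapping term: 1.0 if the light
    reaches the splat, 0.0 if occluded (e.g., looked up from a depth map
    rendered from the light's viewpoint).
    """
    n = splat.normal / np.linalg.norm(splat.normal)
    to_light = light_pos - splat.position
    dist2 = np.dot(to_light, to_light)
    l = to_light / np.sqrt(dist2)
    v = cam_pos - splat.position
    v = v / np.linalg.norm(v)
    h = (l + v) / np.linalg.norm(l + v)  # half-vector

    # Lambertian diffuse term.
    diffuse = splat.albedo / np.pi * max(np.dot(n, l), 0.0)

    # Blinn-Phong specular as a simple roughness-controlled stand-in
    # for the paper's (unspecified) BRDF model.
    shininess = 2.0 / max(splat.roughness ** 2, 1e-4)
    spec = splat.specular * max(np.dot(n, h), 0.0) ** shininess

    # Inverse-square falloff of the point light, gated by shadow visibility.
    irradiance = np.asarray(light_rgb, dtype=np.float64) / dist2
    return visibility * irradiance * (diffuse + spec)

# Usage: one splat lit from above, camera on the +z axis.
splat = SplatBRDF([0, 0, 0], [0, 0, 1], [0.8, 0.5, 0.3], roughness=0.4, specular=0.2)
rgb = shade_point_light(splat, np.array([0.0, 1.0, 2.0]),
                        np.array([5.0, 5.0, 5.0]), np.array([0.0, 0.0, 3.0]))
```

In the full method this shading would be evaluated per splat inside the differentiable 2D Gaussian splatting rasterizer, so gradients flow back into `albedo`, `roughness`, and `specular`, which is what lets reflectance be decoupled from the changing illumination.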
Supplementary Material: zip
Primary Area: applications to computer vision, audio, language, and other modalities
Submission Number: 7094