DarkGS: Learning Neural Illumination and 3D Gaussians Relighting for Robotic Exploration in the Dark

Published: 19 Apr 2024 · Last Modified: 13 May 2024 · RoboNerF WS 2024 Poster · CC BY 4.0
Keywords: Gaussian Splatting; Robot Perception
TL;DR: This paper presents how to build 3D Gaussians from images taken in a dark environment with the camera and light source moving together.
Abstract: Humans have the remarkable ability to construct consistent mental models of an environment, even under limited or varying levels of illumination. We wish to endow robots with this same capability. In this paper, we tackle the challenge of constructing a photorealistic scene representation under poorly illuminated conditions and with a moving light source. We treat illumination modeling as a learning problem and use the resulting illumination model to aid scene reconstruction. We introduce Neural Light Simulators (NeLiS), a data-driven framework for modeling and calibrating the camera-light system. Building on it, we present DarkGS, a method that applies NeLiS to create a relightable 3D Gaussian scene model capable of real-time, photorealistic rendering from novel viewpoints.
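To make the co-moving camera-light setup concrete, the sketch below shows one simple way such a rig could be simulated: a point light attached to the camera illuminates Gaussian centers with an inverse-square, spotlight-style falloff. This is an illustrative assumption only, not the paper's actual NeLiS model (which is learned from data rather than hand-coded); all function and variable names here are hypothetical.

```python
# Illustrative sketch (NOT the paper's NeLiS/DarkGS implementation):
# relight per-Gaussian albedos under a point light co-located with the
# camera, using a hand-coded inverse-square + spotlight falloff in place
# of the learned illumination model described in the abstract.
import numpy as np

def relight_gaussians(means, albedos, cam_pos, cam_dir,
                      intensity=5.0, spot_exp=4.0):
    """Return relit RGB colors for Gaussian centers `means` (N, 3)."""
    to_pts = means - cam_pos                      # vectors: light -> points
    dists = np.linalg.norm(to_pts, axis=1, keepdims=True)
    dirs = to_pts / np.clip(dists, 1e-8, None)    # unit directions
    # Spotlight term: brighter near the optical axis of the camera-light rig.
    cos_a = np.clip(dirs @ cam_dir, 0.0, 1.0)[:, None]
    falloff = intensity * cos_a**spot_exp / np.clip(dists**2, 1e-8, None)
    return np.clip(albedos * falloff, 0.0, 1.0)   # relit colors in [0, 1]

# Two Gaussians with the same albedo: one on the optical axis, one off-axis.
means = np.array([[0.0, 0.0, 2.0],
                  [1.0, 0.0, 2.0]])
albedos = np.array([[0.8, 0.7, 0.6],
                    [0.8, 0.7, 0.6]])
colors = relight_gaussians(means, albedos,
                           cam_pos=np.zeros(3),
                           cam_dir=np.array([0.0, 0.0, 1.0]))
# The on-axis Gaussian comes out brighter than the off-axis one.
```

In DarkGS the analogous illumination factor is produced by the learned, calibrated NeLiS model rather than this fixed falloff, which is what lets the scene be relit consistently as the light moves with the robot.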
Submission Number: 23