Rehearsal NeRF: Disentangling Dynamic Illuminations in Neural Radiance Fields

22 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: Scene representation, Scene decomposition, Neural radiance field, Novel view synthesis
TL;DR: A NeRF-based approach to decouple dynamic objects, static objects, and dynamic lighting effects in live performance scenes.
Abstract: Although neural radiance fields have made significant progress, dynamic illumination changes remain an unsolved issue. Unlike the settings addressed by related works that parameterize time-variant and time-invariant components of a scene, a subject's observed radiance is highly entangled with its own emitted radiance and the projected lighting colors in the spatio-temporal domain. In this paper, we present $\textit{ReHeaRF}$, a new and effective method to render and reconstruct neural fields under severe illumination changes. Our key idea is to leverage scenes captured under stable lighting, such as rehearsal stages, which can easily be recorded before dynamic illumination occurs, to enforce geometric consistency across different lighting conditions. In particular, ReHeaRF uses a learnable lighting-effect vector that represents illumination colors along the temporal dimension and is used to disentangle projected light colors from scene radiance. Furthermore, ReHeaRF can also reconstruct the neural fields of dynamic objects by using off-the-shelf interactive masks for key frames. To decouple the dynamic objects, we propose a new regularizer that removes dynamic parts whose colors are similar to those of the light sources. We demonstrate the effectiveness of ReHeaRF through robust view synthesis under dynamic illumination conditions, outperforming state-of-the-art approaches in both quantitative and qualitative evaluations. We submit our source code and a video demo as supplementary materials.
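The disentanglement idea in the abstract — a learnable per-frame illumination vector that modulates time-invariant scene radiance — can be sketched roughly as follows. This is a minimal illustrative sketch under our own assumptions, not the paper's implementation: the names (`num_frames`, `light_codes`, `base_radiance`) and the simple elementwise RGB modulation are ours, and a real NeRF pipeline would predict `base_radiance` with an MLP and learn `light_codes` by gradient descent.

```python
import numpy as np

rng = np.random.default_rng(0)

num_frames = 4   # temporal dimension: one learnable lighting vector per frame (assumed)
num_points = 5   # sampled 3D points along camera rays

# Learnable per-frame illumination colors (RGB), one row per time step.
light_codes = rng.uniform(0.5, 1.0, size=(num_frames, 3))

# Time-invariant scene radiance (albedo-like RGB per point), standing in for
# what an MLP head would predict in a NeRF-style model.
base_radiance = rng.uniform(0.0, 1.0, size=(num_points, 3))

def shaded_radiance(t):
    """Modulate the static radiance by frame t's light color (elementwise),
    so the projected light color can later be factored back out."""
    return base_radiance * light_codes[t]

# Observed color at frame t is the product; dividing the light code back out
# recovers the light-invariant component, i.e. the disentangled radiance.
obs = shaded_radiance(2)
recovered = obs / light_codes[2]
assert np.allclose(recovered, base_radiance)
```

In this toy factorization the light color is known; in the method described above it is a learned embedding, optimized jointly with the radiance field so that the product best explains the observed frames.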
Supplementary Material: zip
Primary Area: representation learning for computer vision, audio, language, and other modalities
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 4596