Exposure Completing for Temporally Consistent Neural High Dynamic Range Video Rendering

Published: 20 Jul 2024, Last Modified: 21 Jul 2024 · MM2024 Poster · CC BY 4.0
Abstract:

High dynamic range (HDR) video rendering from low dynamic range (LDR) videos whose frames are captured with alternating exposures faces significant challenges, because the exposure changes from frame to frame and only one exposure is available at each time stamp. These exposure changes and absences cause existing methods to produce flickering HDR results. In this paper, we propose a novel paradigm that renders HDR frames by completing the absent exposure information, so that the exposure information at every time stamp is complete and consistent. Our approach interpolates neighboring LDR frames along the time dimension to reconstruct LDR frames for the absent exposures. Combining the interpolated frames with the given LDR frames, the complete set of exposures becomes available at each time stamp. This benefits the fusion process for HDR results, reducing noise and ghosting artifacts and thereby improving temporal consistency. Extensive experimental evaluations on standard benchmarks demonstrate that our method achieves state-of-the-art performance, highlighting the importance of completing absent exposures in HDR video rendering. The code will be made publicly available upon the acceptance of this paper.
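The abstract describes a two-stage pipeline: first reconstruct the absent exposure at each time stamp by interpolating the temporally adjacent frames that share that exposure, then fuse the given and reconstructed LDR frames into an HDR frame. The sketch below is only an illustrative approximation of that idea, not the authors' method: it stands in a plain temporal average for the learned interpolation network and a classic triangle-weighted merge for the learned fusion, and the function names (`complete_and_fuse`, `linearize`) and the gamma-based linearization are assumptions made for illustration.

```python
import numpy as np

def linearize(ldr, exposure, gamma=2.2):
    """Map an LDR frame in [0, 1] to the linear radiance domain (assumed gamma curve)."""
    return (ldr ** gamma) / exposure

def complete_and_fuse(frames, exposures, gamma=2.2, eps=1e-6):
    """Toy two-exposure pipeline: fill in the absent exposure at each time stamp
    from the two temporally adjacent frames that were captured with it, then
    merge both exposures into one HDR frame per time stamp.

    frames:    list of float32 arrays in [0, 1], captured with alternating exposures
    exposures: list of exposure times, e.g. [t_low, t_high, t_low, t_high, ...]
    """
    hdr_video = []
    for i in range(1, len(frames) - 1):
        given = frames[i]
        # Stand-in for a learned interpolation network: a plain temporal average
        # of the neighbors, which share the exposure that is absent at time i.
        interpolated = 0.5 * (frames[i - 1] + frames[i + 1])

        pair = [(given, exposures[i]), (interpolated, exposures[i - 1])]
        num = np.zeros_like(given)
        den = np.zeros_like(given)
        for ldr, exp in pair:
            # Triangle weights favor well-exposed pixels over noisy or saturated ones.
            w = 1.0 - np.abs(2.0 * ldr - 1.0)
            num += w * linearize(ldr, exp, gamma)
            den += w
        hdr_video.append(num / (den + eps))
    return hdr_video

# Minimal usage example on synthetic data (5 frames alternating between two exposures).
rng = np.random.default_rng(0)
frames = [rng.random((4, 4, 3), dtype=np.float32) for _ in range(5)]
hdr = complete_and_fuse(frames, [1 / 60, 1 / 15] * 3)
```

In the paper's setting the averaging step would be replaced by a learned, motion-aware interpolation and the weighted merge by a learned fusion module; the sketch only shows why having both exposures at every time stamp simplifies the merge and suppresses flicker.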

Primary Subject Area: [Experience] Multimedia Applications
Secondary Subject Area: [Experience] Multimedia Applications
Relevance To Conference: High dynamic range (HDR) videos provide audiences in the multimedia world with a more realistic and immersive experience thanks to their wider dynamic range. However, acquiring HDR videos still relies on expensive, specially designed equipment. The challenge of obtaining HDR videos on mobile devices such as smartphones must be addressed before HDR can be widely adopted. Currently, researchers aim to render HDR videos from low dynamic range (LDR) frames with alternating exposures. This means that at any given moment only one exposure is available, i.e., the other exposures are absent. Additionally, alternating exposures cause noise and saturation issues to appear alternately across LDR frames. Furthermore, the motion between frames, accompanied by noise and saturation, presents significant challenges for HDR video rendering. Even the most advanced deep learning methods often produce visual results with obvious artifacts and noise, degrading the temporal consistency of HDR videos and the viewing experience. Therefore, we propose an exposure-completion-based approach that addresses the fundamental problem of exposure absence in alternating-exposure LDR sequences. By better reducing artifacts and noise, our method produces higher-quality HDR videos with improved temporal consistency.
Supplementary Material: zip
Submission Number: 960