Keywords: Image Translation, Nighttime Image Rendering.
TL;DR: Translate images from nighttime to daytime guided by physical priors.
Abstract: Night-to-Day translation (Night2Day) aims to achieve day-like vision at night. However, translating nighttime images with complex degradations using unpaired data remains a challenge in this field. Previous methods that uniformly mitigate these degradations have proven inadequate at simultaneously restoring daytime domain information and preserving the underlying semantics. In this paper, we recognize the distinct degradation patterns in nighttime images and propose N2D3 (Night to Day via Degradation Disentanglement), which comprises a degradation disentanglement module and a degradation-aware contrastive learning module. First, we extract physical priors from a photometric model derived from the Kubelka-Munk theory. Then, under the guidance of these physical priors, we design a disentanglement module to discriminate among different illumination degradation regions. Finally, we introduce degradation-aware contrastive learning to preserve semantic consistency across the distinct degradation regions. Our method is evaluated on two public datasets, \textbf{with a significant improvement of 5.4 FID on BDD100K and 10.3 FID on Alderley}.
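To make the two modules described in the abstract more concrete, below is a minimal, illustrative PyTorch sketch. It is not the authors' implementation: a simple luminance threshold stands in for the Kubelka-Munk-derived physical prior, the thresholds and function names (`disentangle_degradations`, `degradation_aware_nce`) are hypothetical, and "degradation-aware" is read here as restricting contrastive negatives to patches within the same degradation region.

```python
import torch
import torch.nn.functional as F


def disentangle_degradations(img, dark_thr=0.15, flare_thr=0.85):
    """Partition pixels of a nighttime image into degradation regions.

    `img` is an (N, 3, H, W) tensor in [0, 1]. A luminance threshold is used
    here as a stand-in for the paper's Kubelka-Munk-derived prior; the
    threshold values are placeholders. Returns an (N, H, W) integer map:
    0 = under-exposed (dark), 1 = normally lit, 2 = over-exposed (light effects).
    """
    luminance = 0.299 * img[:, 0] + 0.587 * img[:, 1] + 0.114 * img[:, 2]
    region = torch.ones_like(luminance, dtype=torch.long)  # default: normally lit
    region[luminance < dark_thr] = 0                        # dark regions
    region[luminance > flare_thr] = 2                       # flare / light effects
    return region


def degradation_aware_nce(feat_src, feat_tgt, region, temperature=0.07):
    """Patch-wise InfoNCE loss with negatives drawn from the same degradation region.

    `feat_src`, `feat_tgt`: (N, C, H, W) encoder features of the input night
    image and the translated day image. Positives are spatially aligned
    patches; negatives are limited to patches sharing the same region label,
    which is one plausible reading of degradation-aware contrastive learning.
    """
    n, c, h, w = feat_src.shape
    # Down-sample the pixel-level region map to the feature resolution.
    region = F.interpolate(region.float().unsqueeze(1), size=(h, w), mode="nearest")
    region = region.view(n, -1).long()                                   # (N, HW)

    q = F.normalize(feat_tgt.view(n, c, -1).permute(0, 2, 1), dim=-1)    # queries (N, HW, C)
    k = F.normalize(feat_src.view(n, c, -1).permute(0, 2, 1), dim=-1)    # keys    (N, HW, C)
    logits = torch.bmm(q, k.transpose(1, 2)) / temperature               # (N, HW, HW)

    # Mask out negatives that belong to a different degradation region.
    same_region = region.unsqueeze(2) == region.unsqueeze(1)             # (N, HW, HW)
    logits = logits.masked_fill(~same_region, float("-inf"))

    # The positive for each query is the spatially aligned key patch.
    labels = torch.arange(h * w, device=feat_src.device).expand(n, -1)
    return F.cross_entropy(logits.reshape(-1, h * w), labels.reshape(-1))
```

As a usage sketch, one would compute `region = disentangle_degradations(night_img)` on the input and add `degradation_aware_nce(enc(night_img), enc(fake_day_img), region)` to the unpaired translation objective, so that patches are contrasted only against others suffering the same kind of degradation.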
Primary Area: Machine vision
Flagged For Ethics Review: true
Submission Number: 301