Recurrent Color Constancy

ICCV 2017
Abstract: We introduce a novel formulation of temporal color constancy which considers multiple frames preceding the frame for which the illumination is estimated. We propose an end-to-end trainable recurrent color constancy network, the RCC-Net, which exploits convolutional LSTMs and a simulated sequence to learn compositional representations in space and time. We use a standard single-frame color constancy benchmark, the SFU Gray Ball Dataset, which can be adapted to a temporal setting. Extensive experiments show that the proposed method consistently outperforms single-frame state-of-the-art methods and their temporal variants.
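
To make the recurrent formulation concrete, the sketch below shows a minimal convolutional-LSTM illumination estimator in PyTorch. This is an illustrative assumption, not the authors' RCC-Net: a small per-frame CNN encoder, a single ConvLSTM cell aggregating features over the frame sequence, and a head regressing a unit-norm RGB illuminant for the last frame. All layer sizes and names (ConvLSTMCell, RecurrentIlluminantNet) are hypothetical.

```python
# Hypothetical sketch of a conv-LSTM illuminant estimator (not the RCC-Net itself).
import torch
import torch.nn as nn
import torch.nn.functional as F


class ConvLSTMCell(nn.Module):
    """One convolutional LSTM cell: gates computed with 2-D convolutions."""

    def __init__(self, in_ch, hidden_ch, kernel_size=3):
        super().__init__()
        # A single conv produces the four gates (input, forget, cell, output).
        self.gates = nn.Conv2d(in_ch + hidden_ch, 4 * hidden_ch,
                               kernel_size, padding=kernel_size // 2)

    def forward(self, x, state):
        h, c = state
        i, f, g, o = torch.chunk(self.gates(torch.cat([x, h], dim=1)), 4, dim=1)
        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        c = f * c + i * torch.tanh(g)
        h = o * torch.tanh(c)
        return h, c


class RecurrentIlluminantNet(nn.Module):
    """Per-frame CNN encoder + ConvLSTM over time + RGB regression head."""

    def __init__(self, feat_ch=32, hidden_ch=64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, feat_ch, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(feat_ch, feat_ch, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.rnn = ConvLSTMCell(feat_ch, hidden_ch)
        self.head = nn.Linear(hidden_ch, 3)
        self.hidden_ch = hidden_ch

    def forward(self, frames):
        # frames: (batch, time, 3, H, W); the last frame is the one whose
        # illuminant is estimated, earlier frames provide temporal context.
        b, t = frames.shape[:2]
        h = c = None
        for k in range(t):
            x = self.encoder(frames[:, k])
            if h is None:
                h = x.new_zeros(b, self.hidden_ch, x.shape[2], x.shape[3])
                c = torch.zeros_like(h)
            h, c = self.rnn(x, (h, c))
        pooled = F.adaptive_avg_pool2d(h, 1).flatten(1)  # global spatial pooling
        return F.normalize(self.head(pooled), dim=1)     # unit-norm RGB estimate


# Usage: eight preceding 64x64 frames plus the target frame -> one RGB illuminant.
seq = torch.rand(2, 9, 3, 64, 64)
print(RecurrentIlluminantNet()(seq).shape)  # torch.Size([2, 3])
```

Such a network would typically be trained with the angular error between the estimated and ground-truth illuminants, the standard metric used for evaluation on the SFU Gray Ball Dataset.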