External Illumination De-Interfering for Remote Photoplethysmography

16 Sept 2025 (modified: 14 Nov 2025) · ICLR 2026 Conference Withdrawn Submission · CC BY 4.0
Keywords: remote photoplethysmography, computer vision, deep learning
TL;DR: An rPPG framework that disentangles and removes external illumination interference from facial videos for robust physiological signal estimation.
Abstract: Remote photoplethysmography (rPPG), a non-contact technique for extracting physiological signals from facial videos, has drawn increasing interest in the AI community. However, most existing approaches are tailored for idealized studio lighting and struggle in complex real-world scenarios. While some studies attempt to mitigate illumination interference by referencing subject-background features, heterogeneous lighting on the face often violates their underlying assumptions, limiting further performance gains. To address these issues, we propose a novel rPPG framework that counteracts the adverse effects of complex external illumination on biosignal perception. Considering the unknown and dynamic nature of illumination distributions and their influence on facial imaging variations, we introduce relative total variation to disentangle global illumination components and preserve high-frequency biosignal transients, while compressing subtle temporal cues within video sequences. This operation enables a contrastive strategy to model facial illumination representations. The captured illumination distribution is then separated from the original input in a self-supervised manner to yield purified rPPG features. We further incorporate a frequency feedforward-guided Transformer to exploit the quasi-periodic nature of pulse waveforms for vital sign estimation. Extensive experiments on multiple public datasets under diverse lighting and motion conditions demonstrate that our method achieves state-of-the-art performance.
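To make the illumination-disentangling idea concrete, the following is a minimal sketch (not the authors' implementation) of a relative-total-variation-style decomposition of a single grayscale face frame into a smooth illumination layer and a high-frequency residual. The function name `rtv_illumination`, the window size `k`, the smoothing weight `lam`, and the single-frame formulation are illustrative assumptions; the paper applies the idea across video sequences within its full framework.

```python
# Illustrative sketch only, assuming a simplified RTV-style weighting;
# not the paper's actual solver or pipeline.
import numpy as np
from scipy.ndimage import uniform_filter

def rtv_illumination(frame, k=9, lam=0.6, eps=1e-3, iters=3):
    """Approximate the slowly varying illumination layer of a grayscale frame.

    Pixels with high relative total variation (texture and biosignal
    transients) receive small smoothing weights, so the output retains
    only the low-frequency illumination component.
    """
    illum = frame.astype(np.float64)
    for _ in range(iters):
        gy, gx = np.gradient(illum)
        # Windowed total variation: local sum of gradient magnitudes.
        tv = uniform_filter(np.abs(gx), k) + uniform_filter(np.abs(gy), k)
        # Windowed inherent variation: magnitude of locally summed gradients.
        iv = np.abs(uniform_filter(gx, k)) + np.abs(uniform_filter(gy, k))
        # Relative total variation -> small weights on textured pixels.
        w = 1.0 / (tv / (iv + eps) + eps)
        w /= w.max()
        # Weighted box smoothing: textured pixels contribute little, so the
        # result drifts toward the smooth illumination component.
        illum = (1 - lam) * illum + lam * uniform_filter(w * illum, k) / (
            uniform_filter(w, k) + eps)
    return illum

# Usage: subtract the illumination layer to keep pulse-related transients.
frame = np.random.rand(128, 128)          # stand-in for one facial frame
residual = frame - rtv_illumination(frame)
```

In this hedged reading, the residual would carry the high-frequency facial variations used for rPPG, while the illumination layer supplies the representation that the contrastive strategy and self-supervised separation operate on.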
Primary Area: applications to computer vision, audio, language, and other modalities
Submission Number: 6464