Event-based Multi-range Radiance Separation and 3D Reconstruction via Line‑Scan Pseudo‑Square Illumination
Keywords: Computational Photography, Direct-global Separation, Event-based Vision, 3D Reconstruction
Abstract: Decomposing scene radiance into physically meaningful components, including direct reflection, interreflection, and scattering, enables a deeper understanding of scene appearance.
In this paper, we propose the first method to perform multi-range radiance component separation using only events captured by an event camera, without requiring any additional frame-based measurements.
Our approach scans the scene by sweeping line-shaped illumination across it, exploiting the event camera’s high temporal resolution and wide dynamic range to recover both the direct component and multiple global components corresponding to different light propagation distances.
To address the noise inherent in event-integration-based radiance recovery, we present a pixel-wise calibration strategy that leverages the reproducibility of per-pixel noise patterns.
We demonstrate that this calibration is highly effective in suppressing noise, enabling stable recovery from subtle signals.
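The calibration idea above can be illustrated with a minimal sketch: because per-pixel noise patterns are reproducible, integrating events from repeated captures of a static reference scene yields a fixed-pattern offset map that can be subtracted from subsequent measurements. All array shapes, function names, and the simple mean-based estimator here are illustrative assumptions, not the paper's actual procedure.

```python
import numpy as np

def estimate_noise_offset(blank_integrals):
    """Estimate a per-pixel noise offset (hypothetical calibration step).

    blank_integrals: (n_scans, H, W) array of event integrals recorded
    while the scene is static; since the noise pattern reproduces per
    pixel, averaging over scans isolates the fixed-pattern component.
    """
    return blank_integrals.mean(axis=0)  # -> (H, W) offset map

def recover_radiance(event_integral, noise_offset):
    """Subtract the calibrated per-pixel noise pattern from an
    event-integrated measurement (all values in log-intensity units)."""
    return event_integral - noise_offset
```

In this toy model the subtraction suppresses the reproducible noise while leaving the subtle signal intact, which is the behavior the abstract attributes to the calibration.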
Moreover, we show that by detecting the timing at which the scanning line passes each pixel, the same line-scan event data can be exploited for coarse 3D reconstruction.
Experimental results on real scenes show that our event-based approach achieves faster and finer component separation than frame-based alternatives, while also enabling coarse depth estimation without the exposure control that frame-based cameras require.
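The depth-estimation idea can also be sketched: if the sweep speed of the illumination line is known, the timestamp at which the line crosses a pixel identifies the corresponding projector column, and depth follows from standard projector-camera triangulation. The rectified-geometry formula and every parameter below (sweep speed, baseline, focal length) are simplifying assumptions for illustration only.

```python
def timestamp_to_depth(t_pass, x_pix, t0, sweep_speed, baseline, focal):
    """Coarse depth from line-crossing time (hypothetical rectified setup).

    t_pass:      time the sweeping line passes this pixel (s)
    x_pix:       camera column of the pixel (px)
    t0:          time the sweep started (s)
    sweep_speed: projector columns swept per second (px/s)
    baseline:    projector-camera baseline (m)
    focal:       focal length in pixel units (px)
    """
    x_proj = (t0 - t0 + (t_pass - t0)) * sweep_speed  # projector column at crossing
    disparity = x_pix - x_proj                        # camera-projector disparity (px)
    return baseline * focal / disparity               # depth (m)
```

In practice the crossing time would be detected per pixel from the burst of events the passing line triggers; here it is taken as given to keep the triangulation step isolated.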
Supplementary Material: zip
Submission Number: 318