Wavefront Neural Radiance Fields for Multi-depth Reconstruction

Published: 01 Jan 2024 · Last Modified: 06 Mar 2025 · ICPR 2024 · CC BY-SA 4.0
Abstract: This paper proposes a novel NeRF (Neural Radiance Field) variant, WF-NeRF, for accurately recovering wavefront signals with multiple depth values. Although range sensors such as LiDARs typically return depth as scalars, the proposed method uses the 1D raw signals of a LiDAR, i.e., all reflected signals from the multiple objects in the path of a beam. As a result, the coarse sampling stage of conventional NeRFs is no longer required; a single-pass sampling suffices, improving learning and memory efficiency. Exploiting a property of LiDAR signals, namely that signal intensity decays inversely with the square of the distance as the beam spreads over the wavefront, we introduce a new strategy that samples equidistant signals on the wavefront, together with a loss function based on the relative error between the training and predicted values (MSRE: Mean Squared Relative Error). The wavefront sampling produces super-resolution-like effects and improves the accuracy of multiple-depth estimation, while MSRE normalizes the decay of the observed signals and stabilizes the learning process. In experiments with an object occluded by a mesh, we show that conventional NeRFs fail to reconstruct the 3D shape, whereas the proposed WF-NeRF accurately recovers both the mesh and the object, even from fewer input data.
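The abstract does not give the exact formulation of MSRE, but a standard mean squared relative error, sketched below in NumPy, illustrates why it counteracts inverse-square intensity decay: normalizing each squared error by the target magnitude puts weak far-range returns and strong near-range returns on the same scale. The function name, the stabilizing `eps` term, and the toy two-return example are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def msre(pred, target, eps=1e-8):
    """Mean Squared Relative Error: squared error normalized by the
    target magnitude. eps guards against division by zero (assumed
    detail, not specified in the paper)."""
    return np.mean(((pred - target) / (target + eps)) ** 2)

# Inverse-square decay: the same reflectance observed at 1 m and 10 m.
intensity = lambda r: 1.0 / r**2
target = np.array([intensity(1.0), intensity(10.0)])  # [1.0, 0.01]
pred = target * 1.1                                   # uniform 10% error

# Plain MSE is dominated by the strong near return,
mse = np.mean((pred - target) ** 2)
# while MSRE weighs both returns equally: 10% relative error each,
# so the loss is (0.1)^2 = 0.01 regardless of range.
rel = msre(pred, target)
```

Under MSE, the far return's contribution is four orders of magnitude smaller than the near return's, so the network would effectively ignore distant objects; MSRE removes that imbalance, which is the stabilization effect the abstract describes.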