RefRef: A Dataset and Benchmark for Reconstructing Refractive and Reflective Objects

ICLR 2026 Conference Submission16268 Authors

19 Sept 2025 (modified: 08 Oct 2025) · CC BY 4.0
Keywords: 3D Reconstruction, Novel View Synthesis, Neural Radiance Fields, Refractive and Reflective Dataset
Abstract: Modern 3D reconstruction and novel view synthesis approaches have demonstrated strong performance on scenes with opaque, non-refractive objects. However, most assume straight light paths and therefore cannot properly handle refractive and reflective materials. Moreover, datasets specialized for these effects are limited, stymieing efforts to evaluate performance and develop suitable techniques. In this work, we introduce the RefRef dataset for reconstructing scenes with refractive and reflective objects from posed images. Our dataset comprises 50 synthetic objects of varying complexity, from single-material convex shapes to multi-material non-convex shapes, each placed in three different background types, yielding 150 scenes. A real scene that mirrors the synthetic setup is also provided for comparison. We propose an oracle method that, given the object geometry and refractive indices, computes accurate light paths for neural rendering, together with an approach built on it that does not require this ground-truth information. We benchmark these against several state-of-the-art methods and show that all methods lag significantly behind the oracle, highlighting the challenges of the task and dataset.
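The abstract's oracle bends rays at material boundaries using the known geometry and refractive indices. As a point of reference (not the authors' code), the per-boundary direction update follows the vector form of Snell's law; the function below is a minimal sketch, where `d` is the unit ray direction, `n` the unit surface normal pointing toward the incoming ray, and `eta` the ratio of refractive indices n1/n2:

```python
import math

def refract(d, n, eta):
    """Vector form of Snell's law (illustrative sketch, not the paper's code).

    d   : unit incident direction (3-tuple)
    n   : unit surface normal, assumed to point against the incoming ray
    eta : n1 / n2, ratio of refractive indices across the boundary
    Returns the refracted unit direction, or None on total internal reflection.
    """
    cos_i = -sum(a * b for a, b in zip(d, n))       # cosine of incidence angle
    sin2_t = eta * eta * (1.0 - cos_i * cos_i)      # Snell: sin(t)^2
    if sin2_t > 1.0:
        return None                                  # total internal reflection
    cos_t = math.sqrt(1.0 - sin2_t)
    k = eta * cos_i - cos_t
    return tuple(eta * di + k * ni for di, ni in zip(d, n))
```

At normal incidence the ray passes straight through; at grazing angles inside a dense medium (eta > 1) the function returns None, signalling total internal reflection, which a renderer would handle with a mirror bounce instead.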
Supplementary Material: zip
Primary Area: datasets and benchmarks
Submission Number: 16268