Collaborative On-Sensor Array Cameras

Published: 14 Jul 2025, Last Modified: 22 Aug 2025 · OpenReview Archive Direct Upload · CC BY 4.0
Abstract: Modern nanofabrication techniques have enabled us to manipulate the wavefront of light with sub-wavelength-scale structures, offering the potential to replace bulky refractive surfaces in conventional optics with ultrathin metasurfaces. In theory, arrays of nanoposts offer unprecedented control over the phase, polarization, and amplitude of the wavefront at nanometer resolution. A line of recent work has investigated flat computational cameras that replace compound lenses with a single metalens or an array of metasurfaces a few millimeters from the sensor. However, due to the inherent wavelength dependence of metalenses, in practice these cameras do not match their refractive counterparts in image quality for broadband imaging, and may even suffer from hallucinations when relying on generative reconstruction methods. In this work, we investigate a collaborative array of metasurface elements that are jointly learned to perform broadband imaging. To this end, we learn a nanophotonics array with 100 million nanoposts that is end-to-end jointly optimized over the full visible spectrum—a design task that existing inverse-design and learning approaches cannot support due to memory and compute limitations. We introduce a distributed meta-optics learning method to tackle this challenge. This allows us to optimize a large parameter array along with a learned meta-atom proxy and a non-generative reconstruction method that is parallax-aware and noise-aware. The proposed camera performs favorably in simulation and in all experimental tests, irrespective of the scene illumination spectrum.
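The abstract gives no implementation details, so the following is only a minimal JAX sketch of the kind of pipeline it describes, under stated assumptions: a tiny learned meta-atom proxy mapping nanopost radius and wavelength to a complex transmission coefficient, a tile-by-tile gradient pass as a stand-in for the paper's distributed meta-optics learning, and a toy broadband focusing loss in place of the full end-to-end reconstruction loss. All names, shapes, and step sizes here (proxy_transmission, tile_loss, TILES, etc.) are hypothetical, not the authors' implementation.

```python
# Sketch only: a stand-in for distributed end-to-end meta-optics learning.
import jax
import jax.numpy as jnp

KEY = jax.random.PRNGKey(0)
N = 256                 # nanoposts per side per tile (toy scale)
TILES = 4               # tiles processed in sequence to bound memory
WAVELENGTHS = jnp.array([450e-9, 550e-9, 650e-9])  # broadband sampling


def init_proxy(key, width=32):
    """Tiny MLP proxy: (radius, wavelength) -> (phase, log-amplitude)."""
    k1, k2 = jax.random.split(key)
    return {
        "w1": jax.random.normal(k1, (2, width)) * 0.5,
        "b1": jnp.zeros(width),
        "w2": jax.random.normal(k2, (width, 2)) * 0.5,
        "b2": jnp.zeros(2),
    }


def proxy_transmission(params, radius, wavelength):
    """Complex transmission predicted by the learned meta-atom proxy."""
    x = jnp.stack([radius * 1e7,
                   jnp.broadcast_to(wavelength * 1e7, radius.shape)],
                  axis=-1)
    h = jnp.tanh(x @ params["w1"] + params["b1"])
    phase, log_amp = jnp.split(h @ params["w2"] + params["b2"], 2, axis=-1)
    return jnp.exp(log_amp.squeeze(-1)) * jnp.exp(1j * phase.squeeze(-1))


def psf(field):
    """Toy far-field PSF via an FFT (stand-in for full wave propagation)."""
    spectrum = jnp.fft.fftshift(jnp.fft.fft2(field))
    p = jnp.abs(spectrum) ** 2
    return p / p.sum()


def tile_loss(radii_tile, proxy_params):
    """Broadband focusing loss for one tile: reward on-axis energy."""
    def per_wavelength(wl):
        t = proxy_transmission(proxy_params, radii_tile, wl)
        return -psf(t)[N // 2, N // 2]      # maximize central intensity
    return jnp.mean(jax.vmap(per_wavelength)(WAVELENGTHS))


grad_fn = jax.jit(jax.grad(tile_loss, argnums=(0, 1)))

# Radii for each tile; at full scale this array would hold ~1e8 entries,
# which is why gradients are taken tile by tile rather than all at once.
radii = [jax.random.uniform(jax.random.fold_in(KEY, i), (N, N),
                            minval=50e-9, maxval=150e-9)
         for i in range(TILES)]
proxy = init_proxy(KEY)

LR_RADII = 1e-10        # toy step sizes; radii are in meters
LR_PROXY = 1e-3
for step in range(10):
    for i in range(TILES):
        g_r, g_p = grad_fn(radii[i], proxy)
        radii[i] = radii[i] - LR_RADII * g_r
        proxy = jax.tree_util.tree_map(lambda p, g: p - LR_PROXY * g,
                                       proxy, g_p)
    print("step", step, "loss", float(tile_loss(radii[0], proxy)))
```

The tile-by-tile gradient loop is one plausible reading of "distributed" here: it keeps the peak memory of each backward pass bounded by the tile size rather than the full 100-million-post aperture, which is the constraint the abstract cites for why existing inverse-design approaches cannot handle the problem.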