Dynamic Eyebox Steering for Improved Pinlight AR Near-Eye Displays

Xinxing Xia, Zheye Yu, Dongyu Qiu, Andrei State, Tat-Jen Cham, Frank Guan, Henry Fuchs

Published: 2025, Last Modified: 27 Feb 2026 · IEEE Trans. Vis. Comput. Graph. 2025 · CC BY-SA 4.0
Abstract: An optical see-through near-eye display (NED) for augmented reality (AR) allows the user to perceive virtual and real imagery simultaneously. Existing technologies for optical see-through AR NEDs involve trade-offs between key metrics such as field of view (FOV), eyebox size, and form factor. We have enhanced an existing compact wide-FOV pinlight AR NED design with real-time 3D pupil localization in order to dynamically steer, and thus effectively enlarge, the usable eyebox. This is achieved with a dual-camera rig that captures stereoscopic views of the pupils. The 3D pupil location is used to dynamically calculate a display pattern that spatio-temporally modulates the light entering the wearer's eyes. We have built a compact, demonstrable prototype and have conducted a user study that indicates the effectiveness of our eyebox steering method (e.g., without eyebox steering, in 10.5% of our tests, users were unable to perceive the test pattern correctly before the experiment timed out; with eyebox steering, that fraction decreased dramatically to 1.25%). This is a small yet crucial step toward making simple wide-FOV pinlight NEDs usable by human wearers, and not just as demonstration prototypes filmed with a precisely positioned camera standing in for the user's eye. Further contributions of this paper include a detailed description of the display design, calibration technique, and user study design, all of which may benefit other NED research.
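The abstract describes recovering the 3D pupil location from stereoscopic views captured by a dual-camera rig. The paper does not give the implementation here, but the standard approach is disparity-based triangulation with a calibrated, rectified stereo pair. The sketch below is illustrative only: the function name, the pinhole-camera parameters, and the assumption that the two cameras are rectified and displaced along the x-axis by a known baseline are all assumptions, not details from the paper.

```python
import numpy as np

def triangulate_pupil(uv_left, uv_right, focal_px, baseline_m, principal):
    """Illustrative sketch (not the authors' code): triangulate a 3D pupil
    center from its pixel coordinates in a rectified stereo pair.

    Assumes an ideal pinhole model with identical intrinsics for both
    cameras, which are displaced purely along the x-axis by `baseline_m`.
    Returns (x, y, z) in meters, in the left camera's coordinate frame.
    """
    # Horizontal disparity between the two detections, in pixels.
    disparity = uv_left[0] - uv_right[0]
    if disparity <= 0:
        raise ValueError("non-positive disparity: point at or beyond infinity")
    # Depth from the standard rectified-stereo relation z = f * b / d.
    z = focal_px * baseline_m / disparity
    # Back-project the left-image pixel to metric x, y at that depth.
    x = (uv_left[0] - principal[0]) * z / focal_px
    y = (uv_left[1] - principal[1]) * z / focal_px
    return np.array([x, y, z])

# Hypothetical numbers: f = 800 px, baseline 1 cm, principal point (320, 240).
pupil = triangulate_pupil((480.0, 240.0), (320.0, 240.0),
                          focal_px=800.0, baseline_m=0.01,
                          principal=(320.0, 240.0))
```

With these example values the disparity is 160 px, giving a pupil 5 cm in front of the left camera and 1 cm to its right, which is the kind of close-range geometry a head-mounted rig would see. A real system would first detect the pupil center in each image (e.g., via ellipse fitting) and undistort/rectify with calibrated intrinsics and extrinsics before triangulating.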