Centroiding Point-Objects With Event Cameras

Connor Hashemi, Dennis Melamed, Albert W. Reed, Nitesh Menon, Keigo Hirakawa, Scott McCloskey

Published: 01 Jan 2025, Last Modified: 11 Nov 2025. IEEE Transactions on Pattern Analysis and Machine Intelligence. License: CC BY-SA 4.0
Abstract: Event-based sensors (EBS), with their low latency and high dynamic range, are a promising means for tracking unresolved point-objects. Conventional EBS centroiding methods assume the generated events follow a Gaussian distribution and require long event streams ($\gt 1$ s) for accurate localization. However, these assumptions are inadequate for centroiding unresolved objects, because the EBS circuitry produces non-Gaussian event distributions and because long event streams negate the low-latency advantage of EBS. In this work, we derive a closed-form spatiotemporal event distribution that accounts for these non-Gaussian effects and relaxes the long-time-window requirement. Using Fisher analysis, we show that the spatial distribution of events in short time windows ($\leq 20$ ms) contains sufficient information for accurately estimating both position and velocity. To validate our analysis, we create the first EBS dataset of unresolved point-objects with subpixel ground truth using a high-speed monitor. We demonstrate that a small LSTM network can estimate an object's position within 1 pixel and velocity within $\pm 17\%$ using only 5 ms of event data, outperforming traditional approaches. These improvements enable accurate and quick centroiding of fast and dim objects, and we publish all code and data to support future research.
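To make the conventional baseline mentioned above concrete: if the events generated around an unresolved point-object are assumed to follow a Gaussian distribution in image coordinates, the maximum-likelihood centroid is simply the mean of the event positions. The sketch below illustrates this baseline on synthetic data; the function name, event layout, and noise parameters are illustrative assumptions, not the paper's actual method or data.

```python
import numpy as np

def gaussian_centroid(events):
    """Baseline centroid estimate under a Gaussian event assumption.

    events: (N, 3) array of (x, y, t) tuples. Under an isotropic Gaussian
    model of the event positions, the MLE of the centroid is the sample
    mean of the (x, y) coordinates.
    """
    xy = np.asarray(events, dtype=float)[:, :2]
    return xy.mean(axis=0)

# Synthetic illustration (hypothetical numbers): 500 events scattered
# around a true subpixel centroid at (10.3, 5.7) within a 5 ms window.
rng = np.random.default_rng(0)
ev = np.column_stack([
    rng.normal(10.3, 1.0, 500),          # x coordinates
    rng.normal(5.7, 1.0, 500),           # y coordinates
    np.sort(rng.uniform(0, 0.005, 500)), # timestamps (s)
])
cx, cy = gaussian_centroid(ev)
print(cx, cy)  # close to (10.3, 5.7) for this synthetic Gaussian data
```

This estimator is exactly what breaks down in the regime the paper targets: with real EBS circuitry the event distribution is non-Gaussian, so the sample mean is no longer the maximum-likelihood centroid, motivating the closed-form spatiotemporal distribution derived in the paper.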