EV-Catcher: High-Speed Object Catching Using Low-Latency Event-Based Neural Networks

Published: 01 Jan 2022, Last Modified: 17 May 2023. IEEE Robotics and Automation Letters, 2022.
Abstract: Event-based sensors have recently drawn increasing interest in robotic perception due to their lower latency, higher dynamic range, and lower bandwidth requirements compared to standard CMOS-based imagers. These properties make them ideal tools for real-time perception tasks in highly dynamic environments. In this work, we demonstrate an application where event cameras excel: accurately estimating the impact location of fast-moving objects. We introduce a lightweight event representation called Binary Event History Image (BEHI) to encode event data at low latency, as well as a learning-based approach that allows real-time inference of a confidence-enabled control signal to the robot. To validate our approach, we present an experimental catching system in which we catch fast-flying ping-pong balls. We show that the system is capable of achieving a success rate of 81% in catching balls targeted at different locations, with a velocity of up to 13 m/s, even on compute-constrained embedded platforms such as the Nvidia Jetson NX.
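The abstract only sketches the BEHI representation. Below is a minimal illustrative sketch of how such a binary event-history encoding might be constructed; the array shapes, argument names (`xs`, `ys`), and the choice of accumulation window are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def build_behi(xs, ys, height, width):
    """Minimal sketch of a Binary Event History Image (BEHI).

    Assumption: each pixel is set to 1 if at least one event fired there
    during the current history window, regardless of polarity or count.
    `xs`, `ys` are the pixel coordinates of the accumulated events.
    """
    behi = np.zeros((height, width), dtype=np.uint8)
    behi[ys, xs] = 1  # mark every pixel that saw at least one event
    return behi

# Hypothetical usage: accumulate events over a short history window,
# then feed the binary image to a downstream prediction network.
xs = np.array([120, 121, 122])
ys = np.array([64, 64, 65])
image = build_behi(xs, ys, height=480, width=640)
```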