Real-time Classification from Short Event-Camera Streams using Input-filtering Neural ODEs

Anonymous

23 Oct 2020 (modified: 05 May 2023) · Submitted to NeurIPS 2020 Deep Inverse Workshop
Keywords: NeuralODE, Filtering, Event-based
Abstract: Event-based cameras are novel, efficient sensors inspired by the human visual system, generating an asynchronous, pixel-wise stream of data. Learning from such data is typically performed by integrating events into images, which requires buffering long sequences and can limit the response time of the inference system. In this work, we propose to directly use events from a DVS camera, which produces a stream of intensity changes and their spatial coordinates. This sequence is used as input to a novel asynchronous RNN-like architecture, the Input-filtering Neural ODE (INODE). INODE allows input signals to be continuously fed to the network, as is done when filtering dynamical systems. INODE learns to discriminate short event sequences and to perform event-by-event online inference. We demonstrate our approach on a series of classification tasks, comparing against a set of LSTM baselines. We show that, independently of the camera resolution, INODE outperforms the baselines by a large margin on the ASL task and is on par with a considerably larger LSTM on the NCALTECH task. Finally, we show that INODE remains accurate even when provided with very few events.
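To illustrate the input-filtering idea described above, a minimal sketch of an input-filtering neural ODE is given below; it is not the authors' implementation. It assumes the hidden state h evolves as dh/dt = f(h, u), where u is the current event (x, y, polarity), and uses a simple fixed-step Euler integration between event timestamps. All names and sizes (INODE, hidden_dim, event_dim, n_euler) are illustrative assumptions.

```python
# Minimal sketch of an input-filtering neural ODE for event streams (illustrative only).
import torch
import torch.nn as nn


class INODE(nn.Module):
    def __init__(self, event_dim: int = 3, hidden_dim: int = 64, n_classes: int = 10):
        super().__init__()
        # f(h, u): vector field conditioned on the hidden state and the current event input
        self.f = nn.Sequential(
            nn.Linear(hidden_dim + event_dim, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, hidden_dim),
        )
        self.readout = nn.Linear(hidden_dim, n_classes)

    def step(self, h, u, dt, n_euler: int = 4):
        # Integrate dh/dt = f(h, u) over dt, holding the event input u constant
        # (fixed-step Euler; the paper may use a different solver).
        for _ in range(n_euler):
            h = h + (dt / n_euler) * self.f(torch.cat([h, u], dim=-1))
        return h

    def forward(self, events, timestamps):
        # events: (T, event_dim) tensor of (x, y, polarity); timestamps: (T,) tensor.
        h = torch.zeros(1, self.readout.in_features)
        t_prev = timestamps[0]
        logits = []
        for u, t in zip(events, timestamps):
            dt = (t - t_prev).clamp(min=1e-6)
            h = self.step(h, u.unsqueeze(0), dt)
            logits.append(self.readout(h))  # event-by-event online prediction
            t_prev = t
        return torch.stack(logits)
```

Feeding each event as it arrives and reading out a class prediction after every integration step is what enables online, event-by-event inference without first buffering events into images.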