Primary Area: general machine learning (i.e., none of the above)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: SNN, Knowledge Distillation, object detection, event data
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: A new Spiking Neural Network object detector for a challenging automotive event dataset, further improved by knowledge distillation
Abstract: In the era of AI at the edge, self-driving cars, and climate change, the need for energy-efficient, small, embedded AI is growing.
Spiking Neural Networks (SNNs) are a promising approach to address this challenge, with their event-driven information flow and sparse activations.
We propose Spiking CenterNet for object detection on event data.
It combines an SNN CenterNet adaptation with an efficient M2U-Net-based decoder.
Our model significantly outperforms comparable previous work on Prophesee's challenging GEN1 Automotive Detection Dataset while using less than half the energy.
Distilling the knowledge of a non-spiking teacher into our SNN further increases performance.
To the best of our knowledge, our work is the first approach that takes advantage of knowledge distillation in the field of spiking object detection.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 3426