Abstract: Spiking Neural Networks (SNNs), serving as a nexus between neuroscience and machine learning, strive to emulate the intricacies of biological neurons. Their remarkable energy efficiency has garnered significant interest, propelling their adoption in real-world scenarios with stringent resource constraints. Advanced SNNs primarily employ the widely adopted Leaky Integrate-and-Fire (LIF) neuron and are extensively applied across diverse domains. While SNNs have demonstrated image classification performance comparable to convolutional neural networks, a thorough investigation of their robustness is still absent from current research. Our work introduces an adversarial event patch attack, which avoids adding or suppressing events in input spaces with high temporal and spatial resolution. We also propose a novel strategy named extended rate gradient approximation (ERGA), which, when incorporated into the optimization process, accelerates the optimization of the adversarial event patch and promotes its convergence. Our adversarial event patch attack achieves an average attack success rate of up to 68.41%. The code is available at: https://github.com/yszbb/AE-Patch.