Neural-CAT-SNN: Spike-Driven Neural Decoder with Multi-scale Temporal Dynamic and Reparameterized Linear Attention

Yuqi Yang, Tengjun Liu, Haiyan Zhang, Nenggan Zheng, Shaomin Zhang

Published: 01 Jan 2026, Last Modified: 25 Jan 2026 · Crossref · CC BY-SA 4.0
Abstract: Fully implantable brain-machine interfaces (BMIs) require neural decoders that deliver high accuracy under stringent energy constraints. Although spiking neural networks (SNNs) offer a promising, energy-efficient alternative to traditional artificial neural networks (ANNs), existing SNN-based cortical spike train (CST) decoders often underperform due to suboptimal architectural designs. In this work, we propose the neural conv-attention multiscale temporal SNN (Neural-CAT-SNN), a novel framework optimized for high-accuracy, energy-efficient CST decoding with: i) a recurrent dynamic synapse mechanism that enables multiscale temporal feature integration by leveraging heterogeneous membrane and synaptic dynamics alongside recurrent connections; ii) a RepConv channel attention block that fuses reparameterized convolutions with spike-driven linear attention to enhance spatial feature extraction; and iii) reparameterized modules that employ multi-branch convolutions and batch normalization during training and are merged into compact operations for inference. Evaluated on a non-human primate motor prediction task, Neural-CAT-SNN achieves an average R\(^2\) of 0.688, surpassing state-of-the-art SNNs with fewer parameters and lower energy cost, and matching the accuracy of leading ANNs. By maintaining fully accumulate-based operations between synapses, Neural-CAT-SNN demonstrates exceptional energy efficiency, an essential feature for fully implantable BMI applications. Together, these advances position Neural-CAT-SNN as a high-performance, low-power solution for next-generation BMI decoders.
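The abstract does not detail the paper's reparameterized modules, but the merging it describes rests on a standard identity: a train-time Conv+BatchNorm pair can be folded into a single convolution at inference, since batch normalization with frozen statistics is an affine map that can be absorbed into the convolution's weights and bias. Below is a minimal single-channel 1-D sketch of that folding; all names, shapes, and values are illustrative assumptions, not taken from Neural-CAT-SNN.

```python
import numpy as np

def conv1d(x, w, b):
    # 'valid' single-channel 1-D convolution (cross-correlation convention)
    k = len(w)
    return np.array([x[i:i + k] @ w + b for i in range(len(x) - k + 1)])

def fold_bn(w, b, gamma, beta, mean, var, eps=1e-5):
    # BN(y) = gamma * (y - mean) / sqrt(var + eps) + beta, with y = conv(x).
    # Absorb the affine BN into the conv: w' = s*w, b' = s*(b - mean) + beta.
    s = gamma / np.sqrt(var + eps)
    return w * s, (b - mean) * s + beta

# Illustrative parameters (hypothetical, for demonstration only)
rng = np.random.default_rng(0)
x = rng.normal(size=32)
w, b = rng.normal(size=5), 0.3
gamma, beta, mean, var = 1.7, -0.2, 0.5, 2.0

# Train-time path: convolution followed by batch norm (frozen statistics)
y_train = gamma * (conv1d(x, w, b) - mean) / np.sqrt(var + 1e-5) + beta

# Inference path: one merged convolution with folded weights
wf, bf = fold_bn(w, b, gamma, beta, mean, var)
y_infer = conv1d(x, wf, bf)

assert np.allclose(y_train, y_infer)  # the two paths are numerically identical
```

The same algebra extends per output channel to multi-channel convolutions, and parallel reparameterized branches (as in multi-branch training schemes) can additionally be summed into one kernel, which is what makes the merged inference operator compact.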