Abstract: While surrogate backpropagation has proven useful for training deep spiking neural networks (SNNs), incorporating biologically inspired local signals at scale remains challenging. The difficulty stems primarily from the high memory cost of maintaining accurate spike-timing logs and from the potential for purely local plasticity updates to conflict with the supervised learning objective. To leverage local signals derived from spiking neuron dynamics, we introduce Dopamine-Modulated Spike-Synchrony-Dependent Plasticity (DA-SSDP), a loss-sensitive, synchrony-based rule that supplies a local learning signal to the model. DA-SSDP condenses spike patterns into a batch-level synchrony metric. A brief initial warm-up phase assesses this metric's relationship to the task loss and sets a fixed gate that subsequently scales the magnitude of the local update. When synchrony proves unrelated to the task, the gate settles at one, reducing DA-SSDP to a basic two-factor synchrony mechanism that delivers small weight adjustments driven by concurrent spike firing and a Gaussian latency kernel. These small weight updates are added only to the network's deeper layers after the backpropagation step; in our tests, this simplified variant did not degrade performance and sometimes gave a small accuracy boost, acting as a regularizer during training. The rule stores only binary spike indicators and first-spike latencies, which are combined through a Gaussian kernel. Without altering the model architecture or optimization routine, evaluations on CIFAR-10 (+0.42\%), CIFAR-100 (+0.99\%), CIFAR10-DVS (+0.1\%), and ImageNet-1K (+0.73\%) showed consistent accuracy gains with only minor computational overhead.
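To illustrate the two-factor form described in the abstract (the gate fixed at one), the following is a minimal NumPy sketch, not the authors' implementation: the function name, array shapes, and hyperparameters (`lr`, `sigma`) are illustrative assumptions. It computes a batch-averaged weight increment from binary co-firing indicators weighted by a Gaussian of the first-spike latency difference, scaled by a loss-derived gate.

```python
import numpy as np

def ssdp_update(spikes_pre, spikes_post, t_pre, t_post,
                gate=1.0, lr=1e-4, sigma=2.0):
    """Hypothetical two-factor synchrony update (not the paper's code).

    spikes_*: binary spike indicators, shape (batch, n_units)
    t_*: first-spike latencies, shape (batch, n_units)
    gate: loss-derived scalar set during warm-up (gate == 1 recovers
          the plain two-factor rule described in the abstract)
    """
    # Binary co-firing indicator for every post/pre pair in the batch.
    co_fire = spikes_post[:, :, None] * spikes_pre[:, None, :]  # (B, n_post, n_pre)
    # Gaussian kernel on first-spike latency differences.
    dt = t_post[:, :, None] - t_pre[:, None, :]
    kernel = np.exp(-dt**2 / (2.0 * sigma**2))
    # Batch-averaged local increment, to be added after the backprop step.
    return gate * lr * (co_fire * kernel).mean(axis=0)          # (n_post, n_pre)

# Toy usage on random spikes/latencies.
rng = np.random.default_rng(0)
B, n_pre, n_post = 4, 6, 5
spikes_pre = (rng.random((B, n_pre)) < 0.5).astype(float)
spikes_post = (rng.random((B, n_post)) < 0.5).astype(float)
t_pre = rng.uniform(0.0, 10.0, (B, n_pre))
t_post = rng.uniform(0.0, 10.0, (B, n_post))
dW = ssdp_update(spikes_pre, spikes_post, t_pre, t_post)
# dW would be added to a deeper layer's weights after backpropagation.
```

Because only binary indicators and one latency per neuron are stored, the memory cost stays far below that of a full spike-timing log.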
Submission Length: Long submission (more than 12 pages of main content)
Changes Since Last Submission: We are pleased to submit the camera-ready version of our paper. This version incorporates the action editor's suggestion regarding the statistical robustness of our claims.
Specifically, we have:
1. Softened the phrasing around statistical certainty (e.g., changing "reliability" to "consistency" or "variance estimation") to reflect our sample size better.
2. Added a limitation statement explicitly acknowledging that while our results are consistent across 5 runs, broader validation would be needed for strict statistical significance.
These are purely textual adjustments to ensure scientific rigor and do not alter the proposed method or experimental results. We believe these minor polishes make the final version more accurate and robust, aligning with the high standards of TMLR.
Code: https://github.com/NeuroSyd/DA-SSDP
Assigned Action Editor: ~Marlos_C._Machado1
Submission Number: 5985