Self-cross Feature based Spiking Neural Networks for Efficient Few-shot Learning

Published: 01 May 2025, Last Modified: 18 Jun 2025 · ICML 2025 poster · CC BY 4.0
TL;DR: Deep spiking neural networks for efficient few-shot learning with self-cross features
Abstract: Deep neural networks (DNNs) excel in computer vision tasks, especially few-shot learning (FSL), which is increasingly important for generalizing from limited examples. However, DNNs are computationally expensive and face scalability issues in real-world deployment. Spiking neural networks (SNNs), with their event-driven nature and low energy consumption, are particularly efficient at processing sparse and dynamic data, though they still struggle to capture complex spatiotemporal features and to perform accurate cross-class comparisons. To further enhance the performance and efficiency of SNNs in few-shot learning, we propose an SNN-based few-shot learning framework that combines a self-feature extractor module and a cross-feature contrastive module to refine feature representation and reduce power consumption. We combine the temporal efficient training (TET) loss with the InfoNCE loss to optimize the temporal dynamics of spike trains and enhance discriminative power. Experimental results show that the proposed FSL-SNN significantly improves classification performance on the neuromorphic dataset N-Omniglot, and achieves performance competitive with ANNs on static datasets such as CUB and miniImageNet at low power consumption.
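The abstract's training objective pairs a temporal efficient training (TET) loss, which supervises the network's output at every simulation time step rather than only the final one, with an InfoNCE contrastive loss over features. The sketch below illustrates how such a combination could look; it is a minimal, framework-free illustration, and the weighting factor `lam` and the cosine-similarity form of InfoNCE are assumptions, not details taken from the paper.

```python
import math

def softmax_ce(logits, target):
    # Cross-entropy for one sample: -log softmax(logits)[target],
    # computed with the log-sum-exp trick for numerical stability.
    m = max(logits)
    log_z = m + math.log(sum(math.exp(x - m) for x in logits))
    return log_z - logits[target]

def tet_loss(logits_per_step, target):
    # Temporal Efficient Training (TET): average the classification
    # loss over every time step of the spike train, not just the last.
    return sum(softmax_ce(l, target) for l in logits_per_step) / len(logits_per_step)

def info_nce(query, positive, negatives, tau=0.1):
    # InfoNCE: treat the positive as class 0 among all candidates,
    # scoring each pair by temperature-scaled cosine similarity.
    def cos(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb)
    sims = [cos(query, positive)] + [cos(query, n) for n in negatives]
    return softmax_ce([s / tau for s in sims], 0)

def combined_loss(logits_per_step, target, query, positive, negatives, lam=0.5):
    # lam is a hypothetical balancing hyperparameter (not from the paper).
    return tet_loss(logits_per_step, target) + lam * info_nce(query, positive, negatives)
```

In a few-shot episode, `query` would be the embedding of a query sample and `positive`/`negatives` the support-class prototypes; the TET term shapes per-timestep spike dynamics while the InfoNCE term sharpens cross-class separation.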
Lay Summary: Teaching AI to learn from just a few examples—like recognizing a rare bird species from one or two photos—is extremely useful but also very hard. Most powerful AI models today need thousands of images and lots of energy to perform well. Our work uses a different kind of brain-inspired AI called Spiking Neural Networks (SNNs), which are more energy-efficient and mimic how real neurons work. We created a new system that helps SNNs learn better from small amounts of data by using two tricks: one that helps the model understand the details within each image, and another that helps it compare across different classes. We also improved the way the model learns over time, making it more accurate and robust, even when the data is noisy. Our method sets a new performance record for SNNs on a challenging dataset and performs nearly as well as traditional methods on popular benchmarks—while using much less energy. This brings us a step closer to smarter, low-power AI that can work in real-world settings like wearable devices, robots, or environmental monitors.
Primary Area: Applications->Neuroscience, Cognitive Science
Keywords: spiking neural networks, brain-inspired computing, few-shot learning
Submission Number: 8887