Abstract: Spiking neural networks (SNNs) have the merit of energy efficiency and have been widely used in various real-world applications. As with other types of neural networks, the performance of an SNN is largely determined by its architecture. In this paper, we propose an Evolutionary Multi-objective Spiking Neural Architecture Search (EMO-SNAS) method that fully automates the design of SNN architectures with both high performance and low power consumption. To achieve this, we first design a variable-length encoding strategy for SNNs, addressing the limitation of traditional encoding strategies that require the network depth to be set manually in advance. Furthermore, we propose an exploitation operator focusing on local search over the variable-length encoding, as well as an exploration operator focusing on global search based on temporal expansion. Built on NSGA-II, EMO-SNAS effectively balances performance and power consumption during architecture design. Experiments on three widely used image classification datasets show that EMO-SNAS achieves the best results among state-of-the-art methods. Specifically, EMO-SNAS gains 0.45%, 0.26%, and 7.52% in classification accuracy while producing 33%, 29%, and 12% fewer spikes on the CIFAR10, CIFAR100, and TinyImageNet datasets, respectively. Ablation studies show that temporal expansion improves the performance of EMO-SNAS. Moreover, the power-consumption measurement and theoretical convergence of EMO-SNAS are discussed to justify its component design. In addition, with EMO-SNAS, the impact of the number of initial channels in SNNs is systematically investigated, leading to a conclusion that contradicts the existing consensus. The source code is available at https://github.com/songxt3/EMO-SNAS.
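The abstract's multi-objective balancing rests on NSGA-II's fast non-dominated sorting, which ranks candidate architectures into Pareto fronts over the two competing objectives. The sketch below is a minimal, generic illustration of that sorting step, not the authors' implementation; the objective pair (error rate, normalized spike count) and the sample candidates are hypothetical.

```python
from typing import List, Tuple

def dominates(a: Tuple[float, float], b: Tuple[float, float]) -> bool:
    """a dominates b when a is no worse in every objective and strictly
    better in at least one. Both objectives are minimized here:
    (error rate, normalized spike count) -- a hypothetical pairing."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(pop: List[Tuple[float, float]]) -> List[List[int]]:
    """NSGA-II fast non-dominated sorting: return lists of indices,
    where fronts[0] is the Pareto-optimal front."""
    dominated_by = [[] for _ in pop]  # S_p: solutions that p dominates
    dom_count = [0] * len(pop)       # n_p: how many solutions dominate p
    fronts: List[List[int]] = [[]]
    for p, fp in enumerate(pop):
        for q, fq in enumerate(pop):
            if p == q:
                continue
            if dominates(fp, fq):
                dominated_by[p].append(q)
            elif dominates(fq, fp):
                dom_count[p] += 1
        if dom_count[p] == 0:
            fronts[0].append(p)
    i = 0
    while fronts[i]:
        nxt = []
        for p in fronts[i]:
            for q in dominated_by[p]:
                dom_count[q] -= 1
                if dom_count[q] == 0:  # q belongs to the next front
                    nxt.append(q)
        i += 1
        fronts.append(nxt)
    return fronts[:-1]  # drop the trailing empty front

# Hypothetical candidate architectures: (error rate, normalized spikes)
candidates = [(0.05, 0.9), (0.08, 0.4), (0.04, 1.0), (0.08, 0.5), (0.12, 0.3)]
print(non_dominated_sort(candidates))  # → [[0, 1, 2, 4], [3]]
```

In this toy population, candidate 3 is dominated by candidate 1 (equal error, more spikes) and falls to the second front, while the remaining trade-off points form the Pareto front; in EMO-SNAS such ranking is what lets the search keep architectures that trade accuracy against spike count rather than collapsing to a single-objective optimum.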
DOI: 10.1109/TEVC.2025.3528471