Continuous Spiking Graph ODE Networks

27 Sept 2024 (modified: 03 Dec 2024) · ICLR 2025 Conference Withdrawn Submission · CC BY 4.0
Keywords: Spiking graph neural network; graph ODE
Abstract: Spiking Graph Networks (SGNs), as bio-inspired neural models that address the energy-consumption challenge in graph classification, have attracted considerable attention from researchers and industry. However, SGNs are typically applied in static scenarios with real-valued inputs and cannot be directly used for dynamic prediction because of their limited capacity to handle dynamic real-valued features, a limitation we denote architectural inapplicability. Moreover, they suffer from accuracy loss due to the inherently discrete nature of spike-based representations. Inspired by recent graph ordinary differential equation (ODE) methods, we propose \textbf{C}ontinuous \textbf{S}piking \textbf{G}raph \textbf{O}DE Networks (\method{}), which leverages graph ODEs to address the architectural inapplicability and employs high-order structures to mitigate information loss. Specifically, \method{} replaces energy-intensive static SGNs with an efficient graph ODE process by incorporating SGNs and graph ODEs into a unified framework, thereby achieving energy efficiency. We then derive a high-order spike representation capable of preserving more information; integrating it with a high-order graph ODE yields the second-order \method{}, which addresses the information-loss challenge. Furthermore, we prove that the second-order \method{} remains stable during dynamic graph learning. Extensive experiments validate the superior performance of the proposed \method{} while maintaining low power consumption.
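The abstract's core idea of coupling spike-based node representations with continuous graph ODE dynamics can be illustrated with a minimal sketch. This is not the paper's actual method (the abstract gives no equations); it is a hypothetical toy in which node membrane potentials evolve under an explicit-Euler graph ODE whose propagation term is driven by binary spikes. All function names (`lif_spikes`, `spiking_graph_ode_step`), the dynamics `dX/dt = A_norm @ spikes(X) @ W`, and the threshold/step-size values are illustrative assumptions.

```python
import numpy as np

def lif_spikes(x, threshold=1.0):
    """Heaviside spike function: emit 1 where the membrane
    potential crosses the threshold (toy, non-differentiable)."""
    return (x >= threshold).astype(x.dtype)

def spiking_graph_ode_step(X, A_norm, W, dt=0.1, threshold=1.0):
    """One explicit-Euler step of hypothetical spiking graph ODE
    dynamics: dX/dt = A_norm @ spikes(X) @ W. Spikes (not the
    real-valued states) are what propagate over the graph, which
    is the source of the energy savings SGNs target."""
    S = lif_spikes(X, threshold)   # binary spike representation
    dX = A_norm @ S @ W            # graph propagation of spikes
    return X + dt * dX

# Toy 3-node path graph with 4-dimensional node states.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
A_norm = A / A.sum(axis=1, keepdims=True)   # row-normalized adjacency
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 2.0, size=(3, 4))      # membrane potentials
W = 0.1 * rng.standard_normal((4, 4))       # mixing weights

for _ in range(5):                          # integrate over 5 steps
    X = spiking_graph_ode_step(X, A_norm, W)
print(X.shape)
```

A second-order variant, as hinted at in the abstract, would additionally track a velocity state (i.e., integrate `d²X/dt²` as a coupled first-order system); the sketch above stays first-order for brevity.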
Primary Area: other topics in machine learning (i.e., none of the above)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 9336