CSSGT: Contrastive learning-based Split Spiking Graph Transformer
Keywords: Graph Neural Networks, Spiking Neural Networks, Transformers, Mutual Information, Graph Contrastive Learning.
Abstract: Although the integration of Graph Neural Networks (GNNs) and Transformers has demonstrated promising performance across various graph tasks, it remains computationally expensive. In contrast, brain-inspired Spiking Neural Networks (SNNs) offer an energy-efficient architecture due to their unique spike-based, event-driven paradigm. In this paper, we propose CSSGT, a novel framework trained under the graph contrastive learning paradigm that leverages both the strengths of Transformers and the computational efficiency of SNNs for graph tasks. CSSGT comprises two key components: Mutual Information-based Graph Split (MIGS) and Spike-Driven Graph Attention (SDGA). MIGS is designed for the sequential input of SNNs, splitting the graph while maximizing mutual information and minimizing redundancy. SDGA, tailored for graph data, exploits sparse graph convolution and addition operations, achieving low computational energy consumption. Extensive experiments on diverse datasets demonstrate that CSSGT converges within two epochs and outperforms various state-of-the-art models while maintaining low computational cost.
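To make the abstract's spike-driven attention idea concrete, the following is a minimal, illustrative PyTorch sketch of a layer that combines binary spike activations with sparse, addition-style aggregation over graph edges instead of dense softmax attention. It is not the paper's actual SDGA module; the class name `SpikeDrivenGraphAttentionSketch`, the Heaviside spike threshold, and the edge-overlap score are our own illustrative assumptions based only on the abstract.

```python
import torch
import torch.nn as nn


class SpikeDrivenGraphAttentionSketch(nn.Module):
    """Illustrative sketch (not the paper's SDGA): binary spikes plus
    sparse, addition-based message aggregation over graph edges."""

    def __init__(self, in_dim: int, out_dim: int, threshold: float = 1.0):
        super().__init__()
        self.q_proj = nn.Linear(in_dim, out_dim, bias=False)
        self.k_proj = nn.Linear(in_dim, out_dim, bias=False)
        self.v_proj = nn.Linear(in_dim, out_dim, bias=False)
        self.threshold = threshold

    def _spike(self, x: torch.Tensor) -> torch.Tensor:
        # Heaviside step: potentials above threshold emit a spike (1), else 0.
        return (x >= self.threshold).float()

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        # x: [num_nodes, in_dim]; edge_index: [2, num_edges] as (source, target).
        q = self._spike(self.q_proj(x))  # binary query spikes
        k = self._spike(self.k_proj(x))  # binary key spikes
        v = self._spike(self.v_proj(x))  # binary value spikes

        src, dst = edge_index
        # Edge score = spike overlap between query and key: a sum over binary
        # values, restricted to existing edges (sparse, no dense softmax).
        score = (q[dst] * k[src]).sum(dim=-1, keepdim=True)  # [num_edges, 1]

        # Accumulate messages along edges by addition, then re-spike the result.
        out = torch.zeros_like(v)
        out.index_add_(0, dst, score * v[src])
        return self._spike(out)


if __name__ == "__main__":
    # Toy usage: 4 nodes with random features and a small directed edge list.
    x = torch.randn(4, 8)
    edge_index = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 0]])
    layer = SpikeDrivenGraphAttentionSketch(8, 16)
    print(layer(x, edge_index).shape)  # torch.Size([4, 16])
```

Because queries, keys, and values are binary in this sketch, the per-edge products act as masks and the aggregation reduces to selective additions, which is the kind of multiplication-free, event-driven computation the abstract credits for SDGA's low energy cost.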
Primary Area: learning on graphs and other geometries & topologies
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5904