TL;DR: This work introduces Delay-DSGN, a dynamic spiking graph neural network with a learnable delay mechanism, to accurately capture temporal dependencies and mitigate information forgetting in dynamic graph representation learning.
Abstract: Dynamic graph representation learning with Spiking Neural Networks (SNNs) exploits the temporal spiking behavior of neurons, offering advantages in capturing the temporal evolution and sparsity of dynamic graphs. However, existing SNN-based methods often fail to capture how latency in information propagation affects node representations. To address this, we propose Delay-DSGN, a dynamic spiking graph neural network with a learnable delay mechanism. By leveraging synaptic plasticity, the model dynamically adjusts connection weights and propagation speeds, strengthening temporal correlations and enabling historical information to influence future representations. Specifically, we introduce a Gaussian delay kernel into the neighborhood aggregation at each time step, adaptively delaying historical information to future time steps and mitigating information forgetting. Experiments on three large-scale dynamic graph datasets show that Delay-DSGN outperforms eight state-of-the-art methods, achieving the best results on node classification tasks. We also theoretically derive the constraints relating the Gaussian kernel's standard deviation to its size, which ensure stable training and prevent exploding and vanishing gradients.
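The full paper is not reproduced on this page, so the sketch below is only an illustrative guess at how a learnable Gaussian delay kernel over past neighborhood aggregations might look in PyTorch. The class name `GaussianDelayAggregator`, the parameters `mu` and `log_sigma`, the kernel size, and the buffer layout are all hypothetical assumptions, not the authors' actual implementation.

```python
import torch
import torch.nn as nn

class GaussianDelayAggregator(nn.Module):
    """Hypothetical sketch: weight a buffer of the last K per-step neighborhood
    aggregations with a normalized Gaussian kernel whose centre (mu) and spread
    (sigma) are learnable, producing a delayed input for the spiking neurons."""

    def __init__(self, feat_dim: int, kernel_size: int = 5):
        super().__init__()
        self.kernel_size = kernel_size
        # Learnable delay centre and spread; initial values are arbitrary guesses.
        self.mu = nn.Parameter(torch.tensor(0.0))
        self.log_sigma = nn.Parameter(torch.tensor(0.0))  # sigma = exp(log_sigma) > 0
        self.lin = nn.Linear(feat_dim, feat_dim)

    def forward(self, history: torch.Tensor) -> torch.Tensor:
        # history: [K, num_nodes, feat_dim], index 0 = most recent time step.
        k = torch.arange(self.kernel_size, dtype=history.dtype, device=history.device)
        sigma = self.log_sigma.exp()
        weights = torch.exp(-((k - self.mu) ** 2) / (2 * sigma ** 2))
        weights = weights / weights.sum()              # kernel sums to 1
        delayed = (weights.view(-1, 1, 1) * history).sum(dim=0)
        return self.lin(delayed)                       # fed to the spiking neuron as input current
```

In such a design, a rolling buffer of the K most recent aggregated neighbor messages would be maintained at each time step and passed as `history`; normalizing the kernel keeps the magnitude of the delayed input bounded, which is one simple way to avoid amplifying or suppressing historical information as `sigma` changes during training.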
Lay Summary: In the real world, much data can be represented as "graphs," such as social networks and transportation networks. These graphs change over time, and we refer to them as "dynamic graphs." To understand and make use of these dynamic graphs, researchers have proposed "dynamic graph representation learning," a method that converts each node in the graph (such as a person or a location) into a numerical vector that computers can process. In recent years, a method called "Spiking Neural Networks (SNNs)" has been applied to dynamic graph representation learning. It simulates the working mechanism of biological neurons and is particularly good at processing time-dependent data, making it well suited to dynamic graphs that evolve over time. However, existing methods often overlook an important factor in information propagation: "delay." It takes time for information to travel from one node to another, and this delay affects how nodes are represented. To address this issue, we propose a new model called Delay-DSGN. This model incorporates a learnable delay mechanism that automatically adjusts the speed and strength of information propagation, allowing it to better capture how information evolves over time. As a result, our model more accurately captures how dynamic graphs change over time and is applicable to various scenarios that require processing temporal graph structures.
Primary Area: Deep Learning->Graph Neural Networks
Keywords: Graph Neural Networks, Dynamic Graph, Spiking Neural Network, Graph Representation Learning
Submission Number: 14657