Online Graph Nets

29 Sept 2021 (modified: 13 Feb 2023) · ICLR 2022 Conference Withdrawn Submission
Keywords: continuous-time dynamic graphs, temporal graph neural networks, graph neural networks
Abstract: Temporal graph neural networks (T-GNNs) sequentially update node states and use temporal message passing to predict events in continuous-time dynamic graphs. While node states persist in memory, the message-passing operations must be computed on demand for each prediction. In practice, these operations are the computational bottleneck of state-of-the-art T-GNNs, as they require topologically exploring large temporal graphs. To sidestep this bottleneck, we propose Online Graph Nets (OGNs). To avoid temporal message passing, an OGN maintains a summary of each node's temporal neighbors in a latent variable and updates it as events unfold, in an online fashion. At prediction time, OGN simply combines node states with their latents to obtain node-level representations. Consequently, the memory cost of OGN is constant with respect to the number of previous events. Remarkably, OGN outperforms most existing T-GNNs on temporal link-prediction benchmarks while running orders of magnitude faster. For instance, OGN performs on par with the best-known T-GNN on Reddit, with a $374\times$ speedup. Moreover, since OGNs do not explore temporal graphs at prediction time, they are well suited for on-device prediction (e.g., on mobile phones).
One-sentence Summary: The paper proposes Online Graph Nets (OGN), a fast streaming approach to representation learning on continuous-time dynamic graphs that is competitive with state-of-the-art methods while being orders of magnitude faster.
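
The mechanism described in the abstract lends itself to a compact illustration. Below is a minimal PyTorch sketch of the idea as stated there, not the authors' actual architecture: each node keeps a latent summary that is updated online as events arrive, and the prediction-time representation just combines the node's state with its summary. The GRU-style update, the linear readout, and all names (OGNSketch, observe, embed) are illustrative assumptions, not from the paper.

import torch
import torch.nn as nn

class OGNSketch(nn.Module):
    """Hypothetical sketch of the online-summary idea; not the paper's model."""
    def __init__(self, num_nodes: int, dim: int):
        super().__init__()
        # Persistent per-node state and neighbor summary: constant memory
        # with respect to the number of past events.
        self.register_buffer("state", torch.zeros(num_nodes, dim))
        self.register_buffer("summary", torch.zeros(num_nodes, dim))
        self.update = nn.GRUCell(dim, dim)      # assumed online update rule
        self.readout = nn.Linear(2 * dim, dim)  # assumed state/latent combiner

    @torch.no_grad()
    def observe(self, src: int, dst: int) -> None:
        # Online step: fold the new event into each endpoint's summary.
        # No temporal message passing over past neighbors is performed.
        self.summary[src] = self.update(self.state[dst:dst + 1],
                                        self.summary[src:src + 1])[0]
        self.summary[dst] = self.update(self.state[src:src + 1],
                                        self.summary[dst:dst + 1])[0]

    def embed(self, node: int) -> torch.Tensor:
        # Prediction time: combine node state and latent summary, an O(1)
        # lookup that never explores the temporal graph.
        return self.readout(torch.cat([self.state[node], self.summary[node]]))

Under this sketch, a temporal link prediction would replay the event stream through observe and then score a candidate pair by comparing embeddings, e.g. torch.sigmoid(model.embed(u) @ model.embed(v)); the scoring step touches no graph history, which is what makes the approach amenable to on-device use.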