Leveraging Temporal Graph Networks Using Module Decoupling

Published: 16 Nov 2024, Last Modified: 26 Nov 2024
Venue: LoG 2024 Poster
License: CC BY 4.0
Keywords: Temporal Graph Networks, Dynamic graphs, Graphs, Graph Neural Networks
TL;DR: This paper identifies a new challenge in dynamic graph learning and proposes a lightweight architecture to address it.
Abstract: Current memory-based methods for dynamic graph learning use batch processing to efficiently handle high-rate streams of updates. However, the use of batches introduces a phenomenon we term $\textit{missing updates}$, which adversely affects the performance of memory-based models. In this work, we analyze the negative impact of $\textit{missing updates}$ on dynamic graph learning models and propose a module-decoupling strategy to mitigate it. Based on this strategy, we develop the Lightweight Decoupled Temporal Graph Network (LDTGN), a memory-based model with a minimal number of learnable parameters that handles high-frequency updates. We validate our proposed model across diverse dynamic graph benchmarks. LDTGN surpasses the average precision of previous methods by over 20\% in scenarios demanding frequent graph updates, and on the vast majority of benchmarks it achieves better or comparable results while operating at significantly higher throughput than existing baselines. The code to replicate our experiments is available at https://anonymous.4open.science/r/Modules-Decoupling-TGN-6FE7.
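To make the $\textit{missing updates}$ phenomenon from the abstract concrete, here is a minimal sketch of batched processing in a generic memory-based temporal graph model. All names (`memory`, `predict`, `update_memory`) are hypothetical illustrations and do not reflect the authors' actual LDTGN implementation; the point is only that, within a batch, predictions are made from the memory state at the start of the batch, so later events never see the updates caused by earlier events in the same batch.

```python
# Toy illustration of "missing updates" under batched processing in a
# memory-based temporal graph model. Names are hypothetical, not LDTGN's API.
import torch

NUM_NODES, MEM_DIM = 4, 8
memory = torch.randn(NUM_NODES, MEM_DIM)  # one memory vector per node

def predict(src, dst):
    # Score an edge using the *current* memory state (e.g., a dot product).
    return torch.sigmoid((memory[src] * memory[dst]).sum())

def update_memory(src, dst, msg):
    # Toy memory update; real models typically use a GRU/RNN over messages.
    memory[src] = 0.5 * memory[src] + 0.5 * msg
    memory[dst] = 0.5 * memory[dst] + 0.5 * msg

# A batch of interactions arriving in temporal order.
batch = [(0, 1, torch.randn(MEM_DIM)),
         (1, 2, torch.randn(MEM_DIM)),   # depends on the (0, 1) update
         (2, 3, torch.randn(MEM_DIM))]   # depends on both earlier updates

# Batched processing: all predictions are computed from the memory state
# at the start of the batch; memory is updated only once, at the end.
# The second and third events are scored without the updates triggered by
# the earlier events in the same batch -- these are the "missing updates".
scores = [predict(s, d) for s, d, _ in batch]
for s, d, m in batch:
    update_memory(s, d, m)
```

Processing events one at a time would avoid missing updates but sacrifices the throughput that batching provides; the paper's module-decoupling strategy is aimed at resolving this tension.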
Submission Type: Full paper proceedings track submission (max 9 main pages).
Submission Number: 82