DyGMAE: A Novel Dynamic Graph Masked Autoencoder for Link Prediction

Published: 07 May 2025, Last Modified: 13 Jun 2025 | UAI 2025 Poster | CC BY 4.0
Keywords: Dynamic Graph; Link Prediction; Graph Masked Autoencoder; Graph Learning
TL;DR: We propose a novel dynamic graph masked autoencoder (DyGMAE) for the dynamic link prediction task. It employs a multi-scale masking strategy to learn features and integrates contrastive learning to enhance performance.
Abstract: Dynamic link prediction (DLP) is a crucial task in graph learning that aims to predict future links between nodes at subsequent time steps in dynamic graphs. Recently, graph masked autoencoders (GMAEs) have shown promising performance in self-supervised learning, but their application to DLP remains under-explored. Existing GMAEs struggle to capture temporal dependencies, and their random masking discards information that is crucial for DLP. Moreover, most existing DLP methods rely on local information and ignore global information, failing to capture the complex features of real-world dynamic graphs. To address these issues, we propose DyGMAE, a novel dynamic GMAE method specifically designed for DLP. DyGMAE introduces a Multi-Scale Masking Strategy (MSMS), which generates multiple graph views by masking different portions of the edges and reconstructing them. Additionally, a multi-scale masking representation alignment module with a contrastive learning objective aligns the representations encoded from the unmasked edges across these views. Through this design, the different masked views provide diverse information that alleviates the drawbacks of random masking, and the contrastive alignment of views helps the model exploit local and global information jointly. Experiments on benchmark datasets show that DyGMAE achieves superior performance on the DLP task.
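To make the abstract's two ingredients concrete, below is a minimal, hypothetical sketch of the general idea: masking a graph snapshot's edges at multiple ratios, reconstructing the masked edges from the encodings of the unmasked ones, and aligning node representations across the masked views with an InfoNCE-style contrastive loss. All names (GCNEncoder, mask_edges, recon_loss, info_nce, the mask ratios) are illustrative assumptions, not the authors' implementation or hyperparameters.

```python
# Illustrative sketch only: multi-scale edge masking + cross-view contrastive alignment.
import torch
import torch.nn.functional as F


class GCNEncoder(torch.nn.Module):
    """One-layer GCN over a dense, row-normalized adjacency built from unmasked edges."""

    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.lin = torch.nn.Linear(in_dim, hid_dim)

    def forward(self, x, edge_index, num_nodes):
        adj = torch.zeros(num_nodes, num_nodes, device=x.device)
        adj[edge_index[0], edge_index[1]] = 1.0
        adj = adj + adj.t() + torch.eye(num_nodes, device=x.device)  # symmetrize + self-loops
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        return F.relu((adj / deg) @ self.lin(x))


def mask_edges(edge_index, ratio):
    """Randomly split edges into (visible, masked) sets at the given mask ratio."""
    num_edges = edge_index.size(1)
    perm = torch.randperm(num_edges)
    n_mask = int(ratio * num_edges)
    return edge_index[:, perm[n_mask:]], edge_index[:, perm[:n_mask]]


def recon_loss(z, masked_edges, num_nodes):
    """BCE on masked (positive) edges vs. randomly sampled negative node pairs."""
    pos = (z[masked_edges[0]] * z[masked_edges[1]]).sum(-1)
    neg_idx = torch.randint(0, num_nodes, masked_edges.shape)
    neg = (z[neg_idx[0]] * z[neg_idx[1]]).sum(-1)
    return (F.binary_cross_entropy_with_logits(pos, torch.ones_like(pos))
            + F.binary_cross_entropy_with_logits(neg, torch.zeros_like(neg)))


def info_nce(z1, z2, tau=0.5):
    """Contrastive alignment: the same node across two views forms the positive pair."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / tau
    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, labels)


# Toy snapshot: random features and edges; two mask scales stand in for "multi-scale".
num_nodes, feat_dim = 100, 16
x = torch.randn(num_nodes, feat_dim)
edge_index = torch.randint(0, num_nodes, (2, 400))
encoder = GCNEncoder(feat_dim, 32)
opt = torch.optim.Adam(encoder.parameters(), lr=1e-3)

mask_ratios = [0.3, 0.7]  # hypothetical: one lightly and one heavily masked view
views = [mask_edges(edge_index, r) for r in mask_ratios]
zs = [encoder(x, visible, num_nodes) for visible, _ in views]

loss = sum(recon_loss(z, masked, num_nodes) for z, (_, masked) in zip(zs, views))
loss = loss + info_nce(zs[0], zs[1])  # align representations across the masked views
opt.zero_grad(); loss.backward(); opt.step()
```

In this sketch each view sees a different fraction of the edges, so the reconstruction targets differ per view, while the contrastive term pulls together the per-node embeddings produced from the differently masked inputs; the paper's actual encoder, masking schedule, and temporal handling are more involved than shown here.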
Readers: auai.org/UAI/2025/Conference, auai.org/UAI/2025/Conference/Area_Chairs, auai.org/UAI/2025/Conference/Reviewers, auai.org/UAI/2025/Conference/Submission175/Authors, auai.org/UAI/2025/Conference/Submission175/Reproducibility_Reviewers
Submission Number: 175