HiTCO: High-Fidelity Memory and Combined-Objective Training for Dynamic Link Prediction

18 Sept 2025 (modified: 28 Nov 2025) · ICLR 2026 Conference Withdrawn Submission · CC BY 4.0
Keywords: Graph Neural Networks, Temporal Graphs, Link Prediction
Abstract: Temporal graph link prediction is a critical task, yet existing methods face a trade-off between the computational efficiency of memory-based models and the expressive power of graph-based approaches. This tension is sharply highlighted by the introduction of large-scale, realistic benchmarks such as the Temporal Graph Benchmark (TGB), which reveal the scalability limitations of many expressive models. To address this challenge, we introduce HiTCO (High-Fidelity Temporal Representation with Combined-Objective Training), a novel architecture that bridges this gap. HiTCO integrates a high-fidelity memory module, featuring Gated Recurrent Unit (GRU) based updates with principled gradient management, and a deep Multi-Layer Perceptron (MLP) prediction head that learns complex, non-linear interaction patterns between nodes. The model is trained with a novel combined-objective loss function that jointly optimizes for ranking accuracy and classification confidence, directly aligning the training process with modern evaluation protocols. Extensive experiments demonstrate that HiTCO achieves new state-of-the-art performance on several challenging datasets from the TGB suite, significantly outperforming a wide range of memory-based, graph-based, and hybrid baselines. Our work shows that a carefully designed, memory-centric architecture, paired with a powerful predictive head and a tailored training objective, can achieve superior expressiveness without sacrificing the scalability essential for real-world temporal graphs.
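The combined ranking-plus-classification objective described in the abstract could, in its simplest form, be sketched as below. This is an illustrative assumption only: the abstract does not specify the loss terms or their weighting, so the BPR-style ranking term, the binary cross-entropy term, and the mixing weight `lam` are all hypothetical choices.

```python
import math

def _sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def combined_objective(pos_score, neg_score, lam=0.5):
    """Hypothetical combined-objective loss for one (positive, negative) pair.

    pos_score / neg_score: raw model scores (logits) for a true and a
    corrupted link. The actual HiTCO formulation is not given in the
    abstract; this sketch mixes two common choices:
      - a BPR-style ranking term that rewards pos_score > neg_score,
      - a binary cross-entropy term that pushes scores toward 1 / 0.
    """
    # Ranking term: softplus of the negated score margin.
    rank = math.log1p(math.exp(-(pos_score - neg_score)))
    # Classification term: BCE on both scores.
    bce = -math.log(_sigmoid(pos_score)) - math.log(1.0 - _sigmoid(neg_score))
    # Convex combination of the two objectives.
    return lam * rank + (1.0 - lam) * bce
```

Under this sketch, a well-separated pair incurs a much smaller loss than an uninformative one, so gradients favor both correct ranking and confident classification.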
Primary Area: learning on graphs and other geometries & topologies
Submission Number: 10383