Keywords: Dynamic Graphs, Temporal Graph Learning, Graph Neural Networks, Online Learning
TL;DR: We propose a novel scheme for temporal graph learning that operates efficiently in high-update-rate scenarios.
Abstract: Modern approaches for learning on dynamic graphs have adopted batched updates in place of applying updates one by one. Batching makes these techniques practical in streaming scenarios where graph updates arrive at extreme speeds. Using batches, however, forces the models to update infrequently, which degrades their performance. In this work, we propose a decoupling strategy that enables the models to update frequently while still using batches. By decoupling the core modules of temporal graph networks and implementing them with a minimal number of learnable parameters, we developed the Lightweight Decoupled Temporal Graph Network (LDTGN), an exceptionally efficient model for learning on dynamic graphs. LDTGN was validated on various dynamic graph benchmarks, delivering comparable or state-of-the-art results with significantly higher throughput than prior art. Notably, our method outperforms previous approaches by more than 20% on benchmarks that require rapid model update rates, such as USLegis or UNTrade. The code to reproduce our experiments is available at https://github.com/TPFI22/MODULES-DECOUPLING.
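For illustration, a minimal Python sketch of the decoupling idea described in the abstract follows: a parameter-free per-node memory is updated after every event, so its state never lags behind the stream, while the learnable predictor is invoked only once per batch. All names here (NodeMemory, process_batch, the toy predictor) are hypothetical stand-ins, not the paper's actual implementation.

```python
from collections import defaultdict

class NodeMemory:
    """Parameter-free per-node state, cheap enough to update per event."""
    def __init__(self):
        self.last_seen = defaultdict(float)  # node -> timestamp of last event
        self.degree = defaultdict(int)       # node -> running interaction count

    def update(self, src, dst, t):
        # Applied once per event, so the memory updates frequently
        # even though the learnable module only sees full batches.
        self.last_seen[src] = t
        self.last_seen[dst] = t
        self.degree[src] += 1
        self.degree[dst] += 1

    def features(self, src, dst, t):
        # Simple features derived from the memory: recency and activity.
        return (t - self.last_seen[src], t - self.last_seen[dst],
                self.degree[src], self.degree[dst])

def process_batch(memory, predictor, batch):
    """Decoupled processing: memory is updated per event inside the batch;
    only the predictor (the learnable part) waits for the whole batch."""
    feats = []
    for src, dst, t in batch:
        feats.append(memory.features(src, dst, t))  # read state before update
        memory.update(src, dst, t)                  # frequent, per-event update
    return predictor(feats)                         # one batched call

# Usage with a toy predictor: higher score for recently active node pairs.
memory = NodeMemory()
predictor = lambda feats: [1.0 / (1.0 + rs + rd) for rs, rd, _, _ in feats]
batch = [(0, 1, 1.0), (1, 2, 2.0), (0, 2, 3.0)]
print(process_batch(memory, predictor, batch))
```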
Format: Long paper, up to 8 pages. If the reviewers recommend changing it to a short paper, I am willing to revise the paper to fit within 4 pages.
Submission Number: 14