Abstract: Graph Neural Networks (GNNs) have recently become increasingly popular due to
their ability to learn complex systems of relations or interactions. These arise in a
broad spectrum of problems ranging from biology and particle physics to social
networks and recommendation systems. Despite the plethora of different models
for deep learning on graphs, few approaches have been proposed for dealing with
graphs that are dynamic in nature (e.g. evolving features or connectivity over time).
We present Temporal Graph Networks (TGNs), a generic, efficient framework for
deep learning on dynamic graphs represented as sequences of timed events. Thanks
to a novel combination of memory modules and graph-based operators, TGNs
significantly outperform previous approaches while being more computationally
efficient. We furthermore show that several previous models for learning on
dynamic graphs can be cast as specific instances of our framework. We perform a
detailed ablation study of different components of our framework and devise the
best configuration that achieves state-of-the-art performance on several transductive
and inductive prediction tasks for dynamic graphs.
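To make the "memory modules" idea concrete, the following is a minimal sketch of a TGN-style per-node memory updated by a recurrent cell on each timed event. The class name, dimensions, and message tensors are hypothetical illustrations; the full framework described in the paper additionally includes message computation, message aggregation, and a graph-based embedding module.

```python
import torch
import torch.nn as nn

class TGNMemory(nn.Module):
    """Sketch of a per-node memory updated by a GRU cell on timed events.

    This is an illustrative simplification, not the paper's full model:
    messages are assumed to be given, and aggregation is omitted.
    """

    def __init__(self, num_nodes: int, memory_dim: int, message_dim: int):
        super().__init__()
        self.memory = torch.zeros(num_nodes, memory_dim)    # one state vector per node
        self.last_update = torch.zeros(num_nodes)           # timestamp of each node's last event
        self.updater = nn.GRUCell(message_dim, memory_dim)  # memory_i <- GRU(message, memory_i)

    def update(self, nodes: torch.Tensor, messages: torch.Tensor, times: torch.Tensor):
        # Update the memory of the affected nodes with the incoming event messages.
        with torch.no_grad():
            self.memory[nodes] = self.updater(messages, self.memory[nodes])
            self.last_update[nodes] = times

# Toy usage: 5 nodes, 8-dimensional memory, two events arriving at t=1 and t=2.
mem = TGNMemory(num_nodes=5, memory_dim=8, message_dim=8)
mem.update(torch.tensor([0, 2]), torch.randn(2, 8), torch.tensor([1.0, 2.0]))
```

After the two events, only nodes 0 and 2 have non-zero memory and updated timestamps; all other nodes are untouched, which is what makes the scheme efficient on sparse event streams.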