Keywords: Dynamic Networks, Representation Learning, Neural ODEs
Abstract: Representation learning on dynamic graphs has gained increasing attention due to its wide-ranging applications. A common approach is to update node embeddings incrementally, where each new event modifies a stored memory state in a jump-driven manner. For link prediction, however, such methods often capture only short-term correlations between the most recent events and those that immediately follow, rather than the underlying dynamics of the network as an interacting system. In this work, we instead investigate *history-free* dynamic embeddings with TGAE (Temporal Graph AutoEncoder). TGAE leverages Neural ODEs to represent nodes with time-varying embeddings defined by an initial condition and a global vector field parameterized by a neural network. Pairwise interactions are modeled as an inhomogeneous Poisson process whose rates are determined by latent-space distances, enabling self-supervised training from observed events. Experiments on synthetic data and a high school contact-tracing dataset demonstrate that TGAE captures temporal patterns and extrapolates beyond the training horizon.
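To make the abstract's modeling concrete, below is a minimal PyTorch-style sketch of the kind of model it describes: node trajectories defined by a learned initial condition and a shared neural vector field, with pairwise Poisson rates decreasing in latent-space distance. The MLP vector field, Euler integrator, and exponential-of-negative-distance intensity are illustrative assumptions, not the exact TGAE formulation.

```python
# Hypothetical sketch of a Neural-ODE temporal embedding with pairwise Poisson rates.
# The vector field, intensity function, and Euler rollout are illustrative assumptions.
import torch
import torch.nn as nn


class VectorField(nn.Module):
    """Global vector field f_theta(z, t) shared by all nodes (assumed to be an MLP)."""
    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.Tanh(), nn.Linear(hidden, dim)
        )

    def forward(self, z: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        # z: (num_nodes, dim); t: scalar time appended as an extra input feature
        t_col = t * torch.ones(z.shape[0], 1)
        return self.net(torch.cat([z, t_col], dim=-1))


def integrate(z0: torch.Tensor, field: VectorField, t_grid: torch.Tensor) -> torch.Tensor:
    """Euler rollout of dz/dt = f_theta(z, t) from the learned initial condition z0."""
    zs = [z0]
    for k in range(len(t_grid) - 1):
        dt = t_grid[k + 1] - t_grid[k]
        zs.append(zs[-1] + dt * field(zs[-1], t_grid[k]))
    return torch.stack(zs)  # (num_steps, num_nodes, dim)


def poisson_rate(z_t: torch.Tensor, i: int, j: int) -> torch.Tensor:
    """Pairwise intensity lambda_ij(t), decreasing with latent-space distance (assumed form)."""
    return torch.exp(-(z_t[i] - z_t[j]).norm())


# Example usage on a toy graph of 5 nodes with 2-d embeddings
z0 = nn.Parameter(torch.randn(5, 2))        # learned initial condition
field = VectorField(dim=2)
t_grid = torch.linspace(0.0, 1.0, steps=50)
traj = integrate(z0, field, t_grid)         # (50, 5, 2) embedding trajectories
rate_01 = poisson_rate(traj[-1], 0, 1)      # intensity between nodes 0 and 1 at t = 1
```

Under this kind of parameterization, self-supervised training could maximize the inhomogeneous Poisson log-likelihood of the observed events, with the integral over non-event time approximated on the same time grid; the exact objective used by TGAE is specified in the paper.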
Submission Number: 55