Persistent Message Passing

Published: 01 Apr 2021, Last Modified: 05 May 2023 · GTRL 2021 Poster
Keywords: GNN, algorithms, algorithmic reasoning
TL;DR: A persistency mechanism for GNNs that allows queries on their past states.
Abstract: Graph neural networks (GNNs) are a powerful inductive bias for modelling algorithmic reasoning procedures and data structures. Their prowess was mainly demonstrated on tasks featuring Markovian dynamics, where querying any associated data structure depends only on its latest state. For many tasks of interest, however, it may be highly beneficial to support efficient data structure queries dependent on previous states. This requires tracking the data structure's evolution through time, placing significant pressure on the GNN's latent representations. We introduce Persistent Message Passing (PMP), a mechanism which endows GNNs with the capability of querying past states by explicitly persisting them: rather than overwriting node representations, it creates new nodes whenever required. PMP generalises out-of-distribution to more than 2$\times$ larger test inputs on dynamic temporal range queries, significantly outperforming GNNs which overwrite states.
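
To make the persistence idea concrete, here is a minimal, self-contained sketch in plain Python/NumPy of message passing that appends new node versions instead of overwriting them, so past states remain queryable. This illustrates only the general principle described in the abstract, not the paper's actual architecture; all names (`PersistentGraph`, `step`, `query_at`) and the update rule are hypothetical.

```python
# Illustrative sketch only: versioned node states instead of overwrites,
# so any past state can be queried. Not the paper's exact PMP mechanism.
import numpy as np

class PersistentGraph:
    """Toy persistent message passing over a directed graph."""

    def __init__(self, init_states, edges):
        # states[v] holds a list of (timestamp, state) versions for node v
        self.states = {v: [(0, np.asarray(s, dtype=float))]
                       for v, s in enumerate(init_states)}
        self.edges = edges  # directed (src, dst) pairs
        self.t = 0

    def latest(self, v):
        return self.states[v][-1][1]

    def step(self):
        """One round of mean-aggregation message passing. Updated states
        are appended as new versions; old versions are kept intact."""
        self.t += 1
        new = {}
        for v in self.states:
            neigh = [self.latest(u) for (u, w) in self.edges if w == v]
            agg = np.mean(neigh, axis=0) if neigh else np.zeros_like(self.latest(v))
            new[v] = 0.5 * self.latest(v) + 0.5 * agg  # toy update rule
        for v, s in new.items():
            self.states[v].append((self.t, s))  # persist, do not overwrite

    def query_at(self, v, t):
        """State of node v as of time t: latest version with timestamp <= t."""
        versions = [(ts, s) for ts, s in self.states[v] if ts <= t]
        return max(versions, key=lambda p: p[0])[1]


# Two message-passing steps on a 3-node path graph, then a query on a past state.
g = PersistentGraph(init_states=np.eye(3),
                    edges=[(0, 1), (1, 0), (1, 2), (2, 1)])
g.step()
g.step()
print(g.query_at(1, t=1))  # node 1's state after the first step only
```

The sketch keeps per-node version lists as a simplification; as described in the abstract, PMP instead persists past state by creating new nodes rather than overwriting node representations, but the effect is the same: queries dependent on previous states need not be reconstructed from a single latest latent state.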