Recursive Algorithmic Reasoning

Published: 18 Nov 2023, Last Modified: 29 Nov 2023 · LoG 2023 Oral
Keywords: graph neural networks, algorithmic reasoning
TL;DR: We propose a neural architecture that enables networks to reason about recursive problems, significantly improving out-of-distribution generalization over prior work in neural algorithmic reasoning on recursive algorithms.
Abstract: Learning models that execute algorithms can enable us to address a key problem in deep learning: generalizing to out-of-distribution data. However, neural networks are currently unable to execute recursive algorithms because they do not have arbitrarily large memory to store and recall state. To address this, we (1) propose a way to augment graph neural networks (GNNs) with a stack, and (2) develop an approach for sampling intermediate algorithm trajectories that improves alignment with recursive algorithms over previous methods. The stack allows the network to learn to store and recall a portion of the state of the network at a particular time, analogous to the action of a call stack in a recursive algorithm. This augmentation permits the network to reason recursively. We empirically demonstrate that our proposals significantly improve generalization to larger input graphs over prior work on depth-first search (DFS).
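The abstract's core idea — pairing a GNN's message-passing step with a stack that can save and restore node states, analogous to a call stack during recursion — can be illustrated with a minimal sketch. This is not the authors' implementation (see the linked repository for that); the class name, the mean-aggregation update, and the discrete push/pop actions are all simplifying assumptions made for illustration.

```python
# Hypothetical sketch of a stack-augmented GNN step: a controller action can
# PUSH the current node states (saving them before "recursing") or POP a
# previous snapshot (restoring them on "return"), mimicking a call stack.
# All names and the update rule are illustrative assumptions, not the paper's code.
import numpy as np

def mp_step(h, adj, W):
    """One mean-aggregation message-passing step (illustrative only)."""
    msgs = adj @ h / np.maximum(adj.sum(1, keepdims=True), 1)
    return np.tanh(h @ W + msgs)

class StackAugmentedGNN:
    def __init__(self, dim, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(scale=0.1, size=(dim, dim))
        self.stack = []  # snapshots of node states, like call-stack frames

    def step(self, h, adj, action):
        if action == "push":                  # save state before descending
            self.stack.append(h.copy())
        elif action == "pop" and self.stack:  # recall state when returning
            h = self.stack.pop()
        return mp_step(h, adj, self.W)
```

In the paper's setting the push/pop decision would itself be learned rather than supplied externally, which is what lets the network align with recursive traces such as DFS on graphs larger than those seen in training.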
Submission Type: Full paper proceedings track submission (max 9 main pages).
Software: https://github.com/DJayalath/gnn-call-stack/
Submission Number: 23