Learning Graphical State Transitions

Published: 21 Jul 2022, Last Modified: 05 May 2023 · ICLR 2017 Oral
Abstract: Graph-structured data is important in modeling relationships between multiple entities, and can be used to represent states of the world as well as many data structures. Li et al. (2016) describe a model known as a Gated Graph Sequence Neural Network (GGS-NN) that produces sequences from graph-structured input. In this work I introduce the Gated Graph Transformer Neural Network (GGT-NN), an extension of GGS-NNs that uses graph-structured data as an intermediate representation. The model can learn to construct and modify graphs in sophisticated ways based on textual input, and also to use the graphs to produce a variety of outputs. For example, the model successfully learns to solve almost all of the bAbI tasks (Weston et al., 2016), and also discovers the rules governing graphical formulations of a simple cellular automaton and a family of Turing machines.
TL;DR: I introduce a set of differentiable graph transformations, and use them to build a model with a graphical internal state that can extract structured data from text and use it to answer queries.
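As a rough illustration of what a "graphical internal state" manipulated by differentiable transformations might look like, here is a minimal NumPy sketch. The names (GraphState, add_node, set_edge, propagate) are hypothetical, not the paper's API: the actual GGT-NN implements these operations as learned, batched tensor computations, and the key idea preserved here is that node existence and edges are soft strengths in [0, 1], so graph construction and modification remain differentiable.

```python
import numpy as np

class GraphState:
    """Hypothetical sketch of a differentiable graph state: each node has a
    scalar existence strength in [0, 1] and a state vector, and every
    (source, destination, edge-type) triple has a soft edge strength."""

    def __init__(self, node_dim, num_edge_types):
        self.node_dim = node_dim
        self.num_edge_types = num_edge_types
        self.strengths = np.zeros((0,))                # (N,) node strengths
        self.states = np.zeros((0, node_dim))          # (N, D) node states
        self.edges = np.zeros((0, 0, num_edge_types))  # (N, N, E) edge strengths

    def add_node(self, strength, state):
        """Node addition: append a node with a soft existence strength
        rather than a hard insert, keeping the operation differentiable."""
        n = self.strengths.shape[0]
        self.strengths = np.append(self.strengths, strength)
        self.states = np.vstack([self.states, state])
        grown = np.zeros((n + 1, n + 1, self.num_edge_types))
        grown[:n, :n, :] = self.edges
        self.edges = grown

    def set_edge(self, src, dst, edge_type, strength):
        """Edge update: 'adding' or 'removing' an edge just raises or
        lowers its strength in [0, 1]."""
        self.edges[src, dst, edge_type] = strength

    def propagate(self, weight):
        """One propagation step: each node receives messages from its
        neighbors, weighted by edge strength and the sender's existence
        strength. `weight` has shape (E, D, D), one matrix per edge type."""
        msg = np.einsum('sde,sk,s,ekj->dj',
                        self.edges, self.states, self.strengths, weight)
        self.states = np.tanh(self.states + msg)

# Toy usage: build a two-node graph and run one propagation step.
g = GraphState(node_dim=4, num_edge_types=2)
g.add_node(strength=1.0, state=np.random.randn(4))
g.add_node(strength=0.8, state=np.random.randn(4))
g.set_edge(0, 1, edge_type=0, strength=0.9)
g.propagate(weight=np.random.randn(2, 4, 4) * 0.1)
```

In the model itself, these strengths and state updates are produced by gated networks conditioned on the textual input, which is what lets it learn to construct and modify graphs from sentences.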
Conflicts: cs.hmc.edu, hmc.edu
Keywords: Natural language processing, Deep learning, Supervised Learning, Structured prediction