Learning Graphical State Transitions

Daniel D. Johnson

Oct 29, 2016 (modified: Mar 30, 2017) · ICLR 2017 conference submission
  • Abstract: Graph-structured data is important in modeling relationships between multiple entities, and can be used to represent states of the world as well as many data structures. Li et al. (2016) describe a model known as a Gated Graph Sequence Neural Network (GGS-NN) that produces sequences from graph-structured input. In this work I introduce the Gated Graph Transformer Neural Network (GGT-NN), an extension of GGS-NNs that uses graph-structured data as an intermediate representation. The model can learn to construct and modify graphs in sophisticated ways based on textual input, and also to use the graphs to produce a variety of outputs. For example, the model successfully learns to solve almost all of the bAbI tasks (Weston et al., 2016), and also discovers the rules governing graphical formulations of a simple cellular automaton and a family of Turing machines.
  • TL;DR: I introduce a set of differentiable graph transformations, and use them to build a model with a graphical internal state that can extract structured data from text and use it to answer queries. (A toy sketch of one such transformation appears after the metadata below.)
  • Keywords: Natural language processing, Deep learning, Supervised Learning, Structured prediction
  • Conflicts: cs.hmc.edu, hmc.edu
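To make the idea of a differentiable graph transformation concrete, here is a minimal numpy sketch of one plausible such operation. The representation below (a soft node-existence strength per node, a hidden vector per node, and a soft adjacency tensor per edge type) and every name in it are illustrative assumptions of mine, not the paper's exact formulation, which also carries per-node annotation vectors and defines several distinct transformation types (node addition, state update, edge update, propagation, aggregation).

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    class GraphState:
        """Hypothetical differentiable graph state: everything is a real
        tensor, so node and edge 'existence' is a soft value in [0, 1]."""
        def __init__(self, node_strengths, node_states, edge_strengths):
            self.s = node_strengths      # (N,)      soft node existence
            self.h = node_states         # (N, D)    per-node hidden vectors
            self.C = edge_strengths      # (N, N, E) soft adjacency per edge type

    def edge_update(g, W, b):
        """One assumed edge transformation: for every ordered node pair
        (u, v), propose new edge strengths from the concatenated node
        states, then softly overwrite, gated by how strongly both
        endpoints exist. W has shape (2D, E), b has shape (E,)."""
        N, D = g.h.shape
        E = g.C.shape[-1]
        # Rows of `pair` are [h_u ; h_v] for all ordered pairs (u, v).
        pair = np.concatenate([np.repeat(g.h, N, axis=0),
                               np.tile(g.h, (N, 1))], axis=1)   # (N*N, 2D)
        proposed = sigmoid(pair @ W + b).reshape(N, N, E)
        # An edge can only be as "real" as its weaker endpoint.
        gate = np.minimum.outer(g.s, g.s)[:, :, None]           # (N, N, 1)
        g.C = gate * proposed + (1.0 - gate) * g.C
        return g

    # Toy usage: three nodes (one only weakly present), two edge types.
    rng = np.random.default_rng(0)
    g = GraphState(node_strengths=np.array([1.0, 1.0, 0.3]),
                   node_states=rng.normal(size=(3, 4)),
                   edge_strengths=np.zeros((3, 3, 2)))
    g = edge_update(g, W=rng.normal(size=(8, 2)), b=np.zeros(2))
    print(g.C[0, 1])   # soft strength of each edge type from node 0 to node 1

Because every operation here is smooth, a loss on the final graph (or on answers read off of it) can be backpropagated through the whole sequence of transformations, which is what allows rules such as a cellular automaton update or a Turing machine transition to be learned purely from examples.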
