Simulating Action Dynamics with Neural Process Networks

Antoine Bosselut, Omer Levy, Ari Holtzman, Corin Ennis, Dieter Fox, Yejin Choi

Feb 15, 2018 (modified: Apr 29, 2018) ICLR 2018 Conference Blind Submission
  • Abstract: Understanding procedural language requires anticipating the causal effects of actions, even when they are not explicitly stated. In this work, we introduce Neural Process Networks to understand procedural text through (neural) simulation of action dynamics. Our model complements existing memory architectures with dynamic entity tracking by explicitly modeling actions as state transformers. The model updates the states of the entities by executing learned action operators. Empirical results demonstrate that our proposed model can reason about the unstated causal effects of actions, allowing it to provide more accurate contextual information for understanding and generating procedural text, all while offering more interpretable internal representations than existing alternatives.
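  • A minimal sketch of the idea described above, assuming a simplified formulation: entity states live in a dynamic memory, each action selects a learned operator, and only the entities the action attends to have their states transformed. All names, dimensions, and the gated-update rule here are illustrative assumptions, not the authors' implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    D = 8            # state / embedding dimension (illustrative)
    NUM_ENTITIES = 3
    NUM_ACTIONS = 4

    # "Learned" parameters, randomly initialized for illustration:
    # one transformation matrix per action operator, plus the entity memory.
    action_operators = rng.normal(size=(NUM_ACTIONS, D, D))
    entity_states = rng.normal(size=(NUM_ENTITIES, D))

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    def step(entity_states, action_weights, entity_attention):
        """Apply a soft mixture of action operators to the attended entities.

        action_weights:   distribution over action operators (from the verb)
        entity_attention: per-entity gate in [0, 1] selecting affected entities
        """
        # Compose the applied operator as a weighted sum of learned operators.
        op = np.tensordot(action_weights, action_operators, axes=1)  # (D, D)
        # Candidate new state for every entity after the action executes.
        candidate = np.tanh(entity_states @ op.T)
        # Gated update: only attended entities change; the rest keep their state.
        gate = entity_attention[:, None]
        return gate * candidate + (1.0 - gate) * entity_states

    action_weights = softmax(rng.normal(size=NUM_ACTIONS))
    entity_attention = np.array([1.0, 0.0, 1.0])  # the action affects entities 0 and 2

    new_states = step(entity_states, action_weights, entity_attention)
    ```

    The point of the sketch is the causal-simulation view: reading a verb triggers a state transformation of the relevant entities even when the effect is never stated in the text, and unattended entities carry their states forward unchanged.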
  • TL;DR: We propose a new recurrent memory architecture that can track common sense state changes of entities by simulating the causal effects of actions.
  • Keywords: representation learning, memory networks, state tracking