Ordered Neurons: Integrating Tree Structures into Recurrent Neural Networks

Yikang Shen, Shawn Tan, Alessandro Sordoni, Aaron Courville

Sep 27, 2018 · ICLR 2019 Conference Blind Submission
  • Abstract: In general, natural language is governed by a tree structure: smaller units (e.g., phrases) are nested within larger units (e.g., clauses). This hierarchy is strict: when a larger constituent ends, all of the smaller constituents nested within it must also be closed. While the standard LSTM allows different neurons to track information at different time scales, its architecture does not impose such a strict hierarchy. This paper proposes to add this constraint to the system by ordering the neurons; a vector of "master" input and forget gates ensures that when a given unit is updated, all of the units that follow it in the ordering are also updated. To this end, we propose a new RNN unit, ON-LSTM, which achieves good performance on four different tasks: language modeling, unsupervised parsing, targeted syntactic evaluation, and logical inference.
  • Keywords: Deep Learning, Natural Language Processing, Recurrent Neural Networks, Language Modeling
  • TL;DR: We introduce a new inductive bias that integrates tree structures in recurrent neural networks.
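The ordering constraint described in the abstract is realized in the paper with a cumulative-softmax ("cumax") activation that produces the master gates: because the cumulative sum of a softmax is monotone non-decreasing, once a neuron's gate "opens" along the ordering, every later neuron's gate is open too. A minimal NumPy sketch of this idea (variable names are illustrative, not from the paper's code):

```python
import numpy as np

def cumax(x):
    """Cumulative sum of a softmax: monotone non-decreasing values in [0, 1]."""
    e = np.exp(x - x.max())          # shift for numerical stability
    return np.cumsum(e / e.sum())

# Hypothetical pre-activations for a 6-neuron hidden state.
z_f = np.array([0.5, 2.0, -1.0, 0.3, 1.0, -0.5])
z_i = np.array([-0.2, 1.5, 0.8, -1.0, 0.4, 2.0])

f_master = cumax(z_f)        # master forget gate: rises along the ordering
i_master = 1.0 - cumax(z_i)  # master input gate: falls along the ordering
overlap = f_master * i_master  # region where old memory and new input mix
```

The monotone shape of `f_master` is what enforces the hierarchy: a high-level (early-ordered) neuron cannot be overwritten while a lower-level (later-ordered) neuron is preserved.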