StructFormer: Joint Unsupervised Induction of Dependency and Constituency Structure from Masked Language Modeling

28 Sept 2020 (modified: 05 May 2023) · ICLR 2021 Conference Blind Submission
Keywords: Unsupervised Dependency Parsing, Unsupervised Constituency Parsing, Masked Language Model
Abstract: There are two major classes of natural language grammars: the dependency grammar, which models one-to-one correspondences between words, and the constituency grammar, which models the assembly of one or several corresponding words into constituents. While previous unsupervised parsing methods mostly focus on inducing only one class of grammars, we introduce a novel model, StructFormer, that can induce dependency and constituency structure at the same time. To achieve this, we propose a new parsing framework that jointly generates a constituency tree and a dependency graph. We then integrate the induced dependency relations into the transformer, in a differentiable manner, through a novel dependency-constrained self-attention mechanism. Experimental results show that our model achieves strong results on unsupervised constituency parsing, unsupervised dependency parsing, and masked language modeling at the same time.
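For illustration only, below is a minimal NumPy sketch of the general idea of dependency-constrained self-attention: attention weights gated by a soft dependency matrix so the structure stays differentiable. The function name, the multiplicative gating scheme, and the dep_probs matrix are assumptions made for this sketch, not the paper's actual formulation.

import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def dependency_constrained_attention(q, k, v, dep_probs):
    """Toy single-head self-attention whose weights are gated by
    soft dependency probabilities (dep_probs[i, j] ~ probability
    that word j is related to word i in the induced dependency
    graph). Illustrative names and gating; not the authors' code."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                # (n, n) attention logits
    weights = softmax(scores, axis=-1)           # standard attention weights
    gated = weights * dep_probs                  # differentiable structural gate
    gated = gated / (gated.sum(-1, keepdims=True) + 1e-9)  # renormalize rows
    return gated @ v

# Usage on random toy data: 5 words, hidden size 8
rng = np.random.default_rng(0)
n, d = 5, 8
q, k, v = (rng.standard_normal((n, d)) for _ in range(3))
dep_probs = softmax(rng.standard_normal((n, n)), axis=-1)  # soft dependency graph
out = dependency_constrained_attention(q, k, v, dep_probs)
print(out.shape)  # (5, 8)

The point mirrored here is that the dependency structure enters attention as a differentiable weighting rather than a hard mask, so gradients from the masked language modeling loss can flow back into the induced structure.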
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
One-sentence Summary: We propose a novel neural-network-based model that performs unsupervised dependency and constituency parsing at the same time.
Reviewed Version (pdf): https://openreview.net/references/pdf?id=lGlD1BCrNo