Semi-Supervised Semantic Dependency Parsing Using CRF Autoencoders

Sep 25, 2019 · ICLR 2020 Conference · Withdrawn Submission
  • TL;DR: We propose an approach to semi-supervised learning of semantic dependency parsers based on the CRF autoencoder framework.
  • Abstract: Semantic dependency parsing, which aims to find rich bi-lexical relationships, allows words to have multiple dependency heads, resulting in graph-structured representations. We propose an approach to semi-supervised learning of semantic dependency parsers based on the CRF autoencoder framework. Our encoder is a discriminative neural semantic dependency parser that predicts the latent parse graph of the input sentence. Our decoder is a generative neural model that reconstructs the input sentence conditioned on the latent parse graph. Our model is arc-factored and therefore parsing and learning are both tractable. Experiments show our model achieves significant and consistent improvement over the supervised baseline.
  • Keywords: Semi-Supervised Learning, Semantic Dependency Parsing, CRF Autoencoder, Natural Language Processing
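
As a rough illustration of the arc-factored encoder–decoder setup described in the abstract, the sketch below shows one possible PyTorch layout: a BiLSTM encoder with a biaffine scorer producing per-arc scores, an independent per-arc (sigmoid) relaxation standing in for the CRF factorization, and a decoder that reconstructs each word from its expected head context. All names, layer choices, and hyperparameters here (ArcFactoredCRFAutoencoder, the biaffine scorer, the reconstruction head) are illustrative assumptions, not the submission's actual architecture.

```python
# Hypothetical sketch of an arc-factored encoder/decoder in the spirit of a
# CRF autoencoder for semantic dependency parsing. Architecture details are
# illustrative assumptions, not the submission's actual model.
import torch
import torch.nn as nn

class ArcFactoredCRFAutoencoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hidden_dim=200):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.bilstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        # Encoder: biaffine scorer over (head, dependent) pairs -> arc scores.
        self.head_mlp = nn.Linear(2 * hidden_dim, hidden_dim)
        self.dep_mlp = nn.Linear(2 * hidden_dim, hidden_dim)
        self.biaffine = nn.Parameter(torch.randn(hidden_dim, hidden_dim) * 0.01)
        # Decoder: reconstructs each word from its (expected) head context.
        self.reconstruct = nn.Linear(2 * hidden_dim, vocab_size)

    def forward(self, word_ids):
        h, _ = self.bilstm(self.embed(word_ids))               # (B, N, 2H)
        heads = torch.relu(self.head_mlp(h))                   # (B, N, H)
        deps = torch.relu(self.dep_mlp(h))                     # (B, N, H)
        # scores[b, i, j]: score of an arc with head i and dependent j.
        scores = torch.einsum('bih,hk,bjk->bij', heads, self.biaffine, deps)
        # Arc-factored relaxation: independent per-arc probabilities, so a
        # word may attach to several heads (graph-structured output).
        arc_prob = torch.sigmoid(scores)                        # (B, N, N)
        # Decoder input: expected head representation for each dependent.
        head_context = torch.einsum('bij,bih->bjh', arc_prob, h)
        recon_logits = self.reconstruct(head_context)           # (B, N, V)
        return scores, recon_logits

# Toy usage: the unsupervised signal on unlabeled text is the
# reconstruction cross-entropy over the input tokens.
model = ArcFactoredCRFAutoencoder(vocab_size=10_000)
word_ids = torch.randint(0, 10_000, (2, 8))                     # toy batch
scores, recon_logits = model(word_ids)
loss = nn.functional.cross_entropy(
    recon_logits.reshape(-1, recon_logits.size(-1)), word_ids.reshape(-1))
loss.backward()
```

In this sketch, supervised sentences would add a per-arc loss on `scores` against gold graphs, while unlabeled sentences contribute only the reconstruction term; that split is the general CRF-autoencoder recipe, not a claim about the paper's exact objective.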