Reducing Distant Supervision Noise with Maxpooled Attention and Sentence-Level Supervision

Anonymous

06 Jun 2018 (modified: 06 Jun 2018) · OpenReview Anonymous Preprint Blind Submission · Readers: Everyone
Abstract: We propose an effective multitask learning setup for reducing distant supervision noise by leveraging sentence-level supervision. We show how sentence-level supervision can be used to improve the encoding of individual sentences, and to learn which input sentences are more likely to express the relationship between a pair of entities. We also introduce a novel neural architecture for collecting signals from multiple input sentences, which combines the benefits of attention and maxpooling. The proposed method increases AUC from 0.261 to 0.284 (a relative gain of roughly 9%), and outperforms recently published results on the FB-NYT dataset.
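The abstract does not spell out the pooling mechanism, so the following is a minimal PyTorch sketch of one plausible reading of "maxpooled attention": per-sentence sigmoid attention weights followed by an elementwise max over the weighted sentence encodings. The class, parameter names, and the sigmoid scorer are illustrative assumptions, not the paper's exact implementation.

import torch
import torch.nn as nn

class MaxpooledAttention(nn.Module):
    """Sketch: maxpool over attention-weighted sentence encodings,
    combining attention's soft weighting with maxpooling's robustness
    to noisy sentences (an assumed reading of the architecture)."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        # Scores each sentence encoding; sigmoid keeps each weight in
        # (0, 1) so irrelevant sentences can be suppressed independently.
        self.scorer = nn.Linear(hidden_dim, 1)

    def forward(self, sentences: torch.Tensor) -> torch.Tensor:
        # sentences: (num_sentences, hidden_dim), one bag per entity pair
        weights = torch.sigmoid(self.scorer(sentences))  # (n, 1)
        weighted = weights * sentences                   # (n, hidden_dim)
        # Elementwise max across the bag, rather than a weighted average.
        pooled, _ = weighted.max(dim=0)                  # (hidden_dim,)
        return pooled

# Usage: pool a bag of five sentence encodings for one entity pair.
bag = torch.randn(5, 256)
pooled = MaxpooledAttention(256)(bag)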
Keywords: relation extraction, distant supervision, maxpooled attention, multitask learning
TL;DR: A new form of attention that works well for the distant supervision setting, and a multitask learning approach to add sentence-level annotations.
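For the multitask part, one natural formulation is a weighted sum of the bag-level (distantly supervised) loss and a sentence-level loss computed on the directly annotated subset. The sketch below assumes a simple linear combination with a hypothetical weight parameter; the paper's actual objective may differ.

import torch.nn.functional as F

def multitask_loss(bag_logits, bag_labels, sent_logits, sent_labels,
                   sent_weight: float = 1.0):
    """Assumed combined objective: distant-supervision loss on bags
    plus a weighted sentence-level loss on annotated sentences."""
    bag_loss = F.cross_entropy(bag_logits, bag_labels)
    sent_loss = F.cross_entropy(sent_logits, sent_labels)
    return bag_loss + sent_weight * sent_loss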
