Improving Distantly-Supervised Relation Extraction through BERT-based Label & Instance Embeddings

Anonymous

01 Jun 2020 (modified: 01 Jun 2020) · OpenReview Anonymous Preprint Blind Submission · Readers: Everyone
Keywords: relation extraction, distant supervision, transformers, embeddings, attention
Abstract: Distantly-supervised relation extraction (RE) is an effective method for scaling RE to large corpora but suffers from noisy labels. Existing approaches try to alleviate the noise through multi-instance learning and by providing additional information, but they treat labels as independent and seldom consider the relationship between labels and entities. Consequently, potentially valuable information goes unused. We propose REDSandT (Relation Extraction with Distant Supervision and Transformers), a novel distantly-supervised transformer-based RE method that captures highly informative instance and label embeddings for RE by exploiting a pre-trained BERT model. We guide REDSandT to focus solely on relational tokens by fine-tuning BERT on a structured input consisting of the sub-tree connecting an entity pair and the entities' types. From the extracted informative vectors, we shape label embeddings, which we also use as an attention mechanism over instances to further reduce noise. Finally, we represent sentences by concatenating the relation and instance embeddings. Experiments on the NYT-10 dataset show that REDSandT captures a broader set of relations with higher confidence, achieving state-of-the-art AUC (0.424).
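The label-embedding attention over instances described in the abstract can be sketched roughly as follows. This is a minimal NumPy illustration under our own assumptions (dot-product scoring, softmax weighting, concatenation of label and weighted-instance vectors); the paper's exact scoring function, dimensions, and training details may differ.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def bag_representation(instance_embs, label_emb):
    """Hypothetical sketch: attend over a bag of instance embeddings
    using a relation (label) embedding as the query, then concatenate
    the label embedding with the attended instance embedding.

    instance_embs: (num_instances, dim) array of instance embeddings
    label_emb:     (dim,) relation/label embedding
    returns:       (2 * dim,) bag representation
    """
    scores = instance_embs @ label_emb      # similarity of each instance to the label
    weights = softmax(scores)               # attention weights over the bag
    bag_emb = weights @ instance_embs       # noise-reduced weighted instance embedding
    return np.concatenate([label_emb, bag_emb])
```

Instances that align poorly with the relation embedding receive low attention weights, which is one common way multi-instance methods down-weight noisy distant-supervision labels.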