Relation Extraction with Weighted Contrastive Pre-training on Distant Supervision

Anonymous

16 Feb 2022 (modified: 05 May 2023) · ACL ARR 2022 February Blind Submission
Abstract: Contrastive pre-training on distant supervision has shown remarkable effectiveness for improving supervised relation extraction tasks. However, the existing methods ignore the intrinsic noise of distant supervision during the pre-training stage. In this paper, we propose a weighted contrastive learning method by leveraging the supervised data to estimate the reliability of pre-training instances and explicitly reduce the effect of noise. Experimental results on three supervised datasets demonstrate the advantages of our proposed weighted contrastive learning approach, compared to two state-of-the-art non-weighted baselines.
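To make the idea concrete, here is a minimal sketch of what a weighted, InfoNCE-style contrastive objective could look like when each distantly supervised pair carries a reliability weight. This is an illustrative assumption, not the paper's actual formulation: the function name, the in-batch negative scheme, and the weighting strategy are placeholders for the general technique of down-weighting likely-noisy pre-training instances.

```python
# Illustrative sketch (not the paper's code): a weighted InfoNCE-style
# contrastive loss where each anchor's contribution is scaled by a
# reliability weight estimated for its distantly supervised label.
import torch
import torch.nn.functional as F

def weighted_contrastive_loss(anchor, positive, weights, temperature=0.1):
    """anchor, positive: (N, d) embeddings of paired instances;
    weights: (N,) reliability scores in [0, 1] for each pair."""
    anchor = F.normalize(anchor, dim=-1)
    positive = F.normalize(positive, dim=-1)
    # Similarity of each anchor to every positive in the batch;
    # the diagonal holds the true pairs, off-diagonals act as negatives.
    logits = anchor @ positive.t() / temperature
    targets = torch.arange(anchor.size(0), device=anchor.device)
    per_instance = F.cross_entropy(logits, targets, reduction="none")
    # Down-weight likely-noisy pairs instead of treating all pairs equally.
    return (weights * per_instance).sum() / weights.sum().clamp_min(1e-8)
```

In this sketch, setting all weights to 1 recovers a standard non-weighted contrastive loss, which is the kind of baseline the abstract compares against.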
Paper Type: short