Abstract: Few-shot relation classification (RC) aims to identify the relationship between an entity pair from only a few annotated mentions, a task made challenging by the scarcity of annotated data. Augmenting the number of annotated instances is a reasonable remedy, typically realized through distant supervision (DS); however, DS may introduce substantial noise and degrade model performance. Prompt learning (PL), by contrast, improves the quality of annotated instances by helping models understand the input text thoroughly through templates, yet these templates largely neglect the semantics of the labels. To address these issues, we propose coarse-to-fine filters with a concept heuristic prompt (CoFF-CHP), which employs dual-stage filters to select and label reliable instances from an unlabeled corpus. Specifically, we first apply coarse-grained filters to preliminarily select candidate instances. We then design a fine-grained filter, the concept heuristic prompt (CHP), which integrates the semantics of label concepts to help the model capture the link between the input text and the labels. Finally, to reduce the bias toward positive instances, we combine the screened-out data and false-positive instances into a negative set to correct the classifier. Experimental results on the public FewRel dataset show that CoFF-CHP achieves state-of-the-art performance in low-resource scenarios, notably outperforming the best baseline by 15.06% F1 in the 10-seed-instance setting.
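To make the idea of a concept heuristic prompt more concrete, below is a minimal Python sketch (not the authors' code) of how a label's concept description might be woven into a cloze-style template so that a masked language model can judge whether a candidate instance expresses the relation. All names here, including build_chp_prompt, LABEL_CONCEPTS, and the template wording, are illustrative assumptions rather than details taken from the paper.

```python
# Hypothetical sketch of a concept heuristic prompt (CHP) template.
# Assumption: each relation label is paired with a short natural-language
# concept description; the template exposes that description to the model.

LABEL_CONCEPTS = {
    "P26": "spouse, the person someone is married to",
    "P131": "located in the administrative territorial entity",
}

def build_chp_prompt(sentence: str, head: str, tail: str, relation_id: str) -> str:
    """Wrap an instance in a template that surfaces the label's concept semantics."""
    concept = LABEL_CONCEPTS[relation_id]
    return (
        f"{sentence} In this sentence, the relation between {head} and {tail} "
        f"is described by the concept: {concept}. Does this relation hold? [MASK]"
    )

if __name__ == "__main__":
    prompt = build_chp_prompt(
        sentence="Marie Curie married Pierre Curie in 1895.",
        head="Marie Curie",
        tail="Pierre Curie",
        relation_id="P26",
    )
    # The prompt would be fed to a masked language model; a high score for a
    # positive verbalizer token at [MASK] could mark the instance as reliable.
    print(prompt)
```

In this reading, the fine-grained filter scores distantly supervised candidates with such prompts, and instances that fail the check join the screened-out data used to build the negative set described above.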
External IDs: dblp:journals/apin/LiHZSWC24