Abstract: Few-shot named entity recognition aims to identify entity mentions with the support of only a few labeled examples. Existing transfer-learning-based methods learn the semantic features of words in the source domain and transfer them to the target domain, but ignore label-specific information, which differs across domains. We propose a novel Label-Attention Mechanism (LAM) to exploit this overlooked label-specific information. LAM separates label information from semantic features and, through a meta-learning strategy, learns how to obtain label information from a few samples. When transferring to the target domain, LAM replaces the source label information with knowledge extracted from the target domain, thus improving the transferability of the model. We conducted extensive experiments on multiple datasets, including OntoNotes, CoNLL'03, WNUT'17, GUM, and Few-Nerd, under two experimental settings. The results show that LAM outperforms the state-of-the-art baseline models by 7% in absolute F1 score.
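As a rough illustration of the idea described above, the following minimal sketch (not the paper's actual architecture) shows one way a label-attention decomposition could work: each token representation attends over a set of label embeddings, the attended mixture is treated as the label-specific component, and the residual is kept as domain-transferable semantic features. All names and the dot-product attention form are assumptions for illustration only; swapping in target-domain label embeddings then recombines the same semantic features with new label information.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def label_attention(token_feats, label_embs):
    """Decompose token representations into a label-specific part
    (attention-weighted mixture of label embeddings) and a residual
    semantic part.

    token_feats: (T, D) token representations
    label_embs:  (L, D) one embedding per label in the label set
    """
    scores = token_feats @ label_embs.T            # (T, L) dot-product scores
    weights = softmax(scores, axis=-1)             # attention over labels
    label_info = weights @ label_embs              # (T, D) label-specific part
    semantic = token_feats - label_info            # (T, D) residual semantics
    return semantic, label_info, weights

rng = np.random.default_rng(0)
tokens = rng.normal(size=(5, 8))          # 5 tokens, dim 8
source_labels = rng.normal(size=(4, 8))   # 4 source-domain labels
target_labels = rng.normal(size=(3, 8))   # 3 target-domain labels

semantic, _, _ = label_attention(tokens, source_labels)
# At transfer time, reuse the semantic part and recombine it with
# target-domain label information instead of the source-domain one.
_, target_info, _ = label_attention(tokens, target_labels)
transferred = semantic + target_info
```

In this toy version the decomposition is exact by construction (semantic + label_info reconstructs the input), which makes the "replace source label information with target knowledge" step a simple recombination.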