SiTNER: Improving Few-Shot Cross-lingual Nested Named Entity Recognition with High-Quality Pseudo-Labels

Anonymous

16 Oct 2023 · ACL ARR 2023 October Blind Submission
Readers: Everyone
Abstract: Few-shot named entity recognition (NER) methods have shown preliminary effectiveness on flat tasks. However, existing methods still struggle with cross-lingual and nested entities due to linguistic and nested-structure gaps. In this work, we propose a framework named SiTNER for few-shot cross-lingual nested named entity recognition. SiTNER comprises two main components: (1) contrastive span classification, which pulls entity spans toward their corresponding prototypes and generates high-quality pseudo-labels, and (2) masked pseudo-data self-training, which refines the pseudo-labels and improves span classification via a self-training strategy. We train SiTNER on an English dataset and evaluate it on English, German, and Russian datasets; experimental results show that our method achieves comparable results.
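The submission does not include implementation details, so the following is a minimal, hypothetical PyTorch sketch (not the authors' code) of the two ideas the abstract names: prototype-based contrastive span classification and confidence-filtered pseudo-label selection for self-training. The function names, the confidence `threshold`, and the `temperature` value are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def build_prototypes(support_emb, support_labels, num_classes):
    """Class prototypes = mean embedding of the support spans of each class (assumed setup)."""
    return torch.stack([
        support_emb[support_labels == c].mean(dim=0) for c in range(num_classes)
    ])

def prototype_logits(span_emb, prototypes, temperature=0.1):
    """Cosine-similarity logits between span embeddings and class prototypes."""
    span_emb = F.normalize(span_emb, dim=-1)        # (num_spans, hidden)
    prototypes = F.normalize(prototypes, dim=-1)    # (num_classes, hidden)
    return span_emb @ prototypes.T / temperature    # (num_spans, num_classes)

def contrastive_span_loss(span_emb, labels, prototypes, temperature=0.1):
    """Pull each span toward its own class prototype and push it from the others."""
    return F.cross_entropy(prototype_logits(span_emb, prototypes, temperature), labels)

def select_pseudo_labels(span_emb, prototypes, threshold=0.9, temperature=0.1):
    """Keep only high-confidence predictions as pseudo-labels for self-training."""
    probs = prototype_logits(span_emb, prototypes, temperature).softmax(dim=-1)
    confidence, pseudo = probs.max(dim=-1)
    keep = confidence >= threshold
    return pseudo[keep], keep
```

In this reading, the pseudo-labels retained by the confidence filter would be fed back as additional training signal for the span classifier; the masking step described in the abstract is omitted here because the submission gives no detail on it.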
Paper Type: long
Research Area: Efficient/Low-Resource Methods for NLP
Contribution Types: Approaches to low-resource settings
Languages Studied: English, German, Russian
Consent To Share Submission Details: On behalf of all authors, we agree to the terms above to share our submission details.