SmartSpanNER: Making SpanNER Robust in Low Resource Scenarios

Published: 07 Oct 2023, Last Modified: 01 Dec 2023
EMNLP 2023 Findings
Submission Type: Regular Long Paper
Submission Track: Syntax, Parsing and their Applications
Submission Track 2: Information Extraction
Keywords: SpanNER, Named Entity Head, SmartSpanNER, Multi-task Learning
TL;DR: This paper proposes SmartSpanNER, a multi-task learning approach that introduces the Named Entity Head prediction task into SpanNER, to address the sensitivity of SpanNER to the amount of training data.
Abstract: Named Entity Recognition (NER) is one of the most fundamental tasks in natural language processing. Span-level prediction (SpanNER) is more naturally suited to nested NER than sequence labeling (SeqLab). However, according to our experiments, the SpanNER method is more sensitive to the amount of training data: its F1 score drops much more sharply than that of SeqLab as the amount of training data decreases. To improve the robustness of SpanNER in low-resource scenarios, we propose SmartSpanNER, a simple and effective method that introduces a Named Entity Head (NEH) prediction task into SpanNER and performs multi-task learning together with the span classification task. Experimental results demonstrate that SmartSpanNER greatly improves the robustness of SpanNER in low-resource scenarios constructed on the CoNLL03, Few-NERD, GENIA and ACE05 standard benchmark datasets.
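
The abstract describes a multi-task setup in which a token-level NEH prediction task is trained jointly with span classification over a shared encoder. The sketch below illustrates one plausible way such a joint objective could be wired up; the encoder choice, the start/end span representation, the binary NEH formulation, and the loss weight `alpha` are assumptions for illustration, not the authors' actual implementation.

```python
# Illustrative sketch only: a minimal multi-task head combining span classification
# (as in SpanNER) with an auxiliary token-level Named Entity Head (NEH) prediction
# task over a shared encoder. Details are assumptions, not the paper's exact model.
import torch
import torch.nn as nn

class MultiTaskSpanNER(nn.Module):
    def __init__(self, hidden_size=768, num_entity_types=5):
        super().__init__()
        # Span classifier: scores a candidate span from its start/end token representations.
        self.span_classifier = nn.Linear(hidden_size * 2, num_entity_types + 1)  # +1 for "not an entity"
        # NEH head: binary prediction of whether each token is the head word of a named entity.
        self.neh_classifier = nn.Linear(hidden_size, 2)

    def forward(self, token_reprs, span_starts, span_ends):
        # token_reprs: (batch, seq_len, hidden) from a shared encoder (e.g. BERT).
        batch_idx = torch.arange(token_reprs.size(0)).unsqueeze(1)
        start_reprs = token_reprs[batch_idx, span_starts]   # (batch, num_spans, hidden)
        end_reprs = token_reprs[batch_idx, span_ends]       # (batch, num_spans, hidden)
        span_logits = self.span_classifier(torch.cat([start_reprs, end_reprs], dim=-1))
        neh_logits = self.neh_classifier(token_reprs)        # (batch, seq_len, 2)
        return span_logits, neh_logits

def multi_task_loss(span_logits, span_labels, neh_logits, neh_labels, alpha=1.0):
    # Joint objective: span classification loss plus a weighted auxiliary NEH loss.
    ce = nn.CrossEntropyLoss()
    span_loss = ce(span_logits.view(-1, span_logits.size(-1)), span_labels.view(-1))
    neh_loss = ce(neh_logits.view(-1, 2), neh_labels.view(-1))
    return span_loss + alpha * neh_loss
```

The intent of the auxiliary task, as described in the abstract, is to give the model an additional word-level signal about likely entity heads, which can stabilize span classification when labeled data is scarce.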
Submission Number: 478