ITER: Iterative Transformer-based Entity Recognition and Relation Extraction

Anonymous

16 Dec 2023 · ACL ARR 2023 December Blind Submission
TL;DR: In this work, we propose ITER, a functionally more expressive, non-autoregressive model that unifies several improvements to a recent language modeling approach.
Abstract: Entity Recognition and Relation Extraction are essential components in extracting structured information from text. Recent advances in both tasks generate a structured representation of the information in an autoregressive fashion, a time-intensive and computationally expensive approach. This raises the natural question of whether autoregressive methods are necessary to achieve comparable results. In this work, we propose ITER, a functionally more expressive, non-autoregressive model that unifies several improvements to a recent language modeling approach: ITER improves inference throughput by up to 23x, is capable of handling nested entities, and effectively halves the number of required parameters in comparison. Furthermore, we achieve a state-of-the-art result of 84.30 F1 on the relation extraction dataset ADE and demonstrate competitive performance on named entity recognition with GENIA and CoNLL03, as well as on relation extraction with CoNLL04 and NYT.
Paper Type: long
Research Area: Information Extraction
Contribution Types: NLP engineering experiment
Languages Studied: English