Combining Graph and Recurrent Networks for Efficient and Effective Segment Tagging

Published: 24 Nov 2022, Last Modified: 05 May 2023. LoG 2022 Poster.
Keywords: Entity tagging, Graph Neural Networks, Information extraction
TL;DR: An extremely light entity tagging model combining Transformers for text feature extraction with Graph Neural Networks and recurrent layers for segment interaction.
Abstract: Graph Neural Networks have been demonstrated to be highly effective and efficient at learning relationships between nodes, both locally and globally. They are also well suited to document-related tasks thanks to their flexibility and capacity to adapt to complex layouts. However, information extraction from documents remains a challenge, especially when dealing with unstructured documents. The semantic tagging of text segments (a.k.a. entity tagging) is one of the essential tasks. In this paper we present SeqGraph, a new model that combines Transformers for text feature extraction with Graph Neural Networks and recurrent layers for segment interaction, for efficient and effective segment tagging. We address some of the limitations of current architectures and Transformer-based solutions. We optimize the model architecture by combining Graph Attention layers (GAT) and Gated Recurrent Units (GRUs), and we provide an ablation study on the design choices to demonstrate the effectiveness of SeqGraph. The proposed model is extremely light (4 million parameters), reducing the parameter count by a factor of 100 to 200 compared to its competitors, while achieving state-of-the-art results (97.23% F1 score on the CORD dataset).
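To make the described architecture more concrete, the following is a minimal PyTorch Geometric sketch (not the authors' code) of how a Graph Attention layer and a GRU could be alternated over segment features, as the abstract outlines. The class name SegmentInteractionBlock, all layer sizes, the edge construction, and the tag count are illustrative assumptions; the paper's actual architecture and hyperparameters are in the PDF.

import torch
import torch.nn as nn
from torch_geometric.nn import GATConv

class SegmentInteractionBlock(nn.Module):
    """One GAT + GRU round over segment embeddings (illustrative sketch)."""

    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        # GAT layer: each segment attends to its graph neighbours.
        self.gat = GATConv(dim, dim // heads, heads=heads)
        # Bidirectional GRU: segments interact along reading order.
        self.gru = nn.GRU(dim, dim // 2, bidirectional=True, batch_first=True)

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        # x: (num_segments, dim) segment features from a text encoder.
        x = torch.relu(self.gat(x, edge_index))
        # Treat the segment sequence as a single batch element for the GRU.
        out, _ = self.gru(x.unsqueeze(0))
        return out.squeeze(0)

# Toy usage: 5 segments, 64-dim features, a small hand-made edge list.
x = torch.randn(5, 64)
edge_index = torch.tensor([[0, 1, 1, 2, 3], [1, 0, 2, 3, 4]])
block = SegmentInteractionBlock(dim=64)
tags = nn.Linear(64, 30)(block(x, edge_index))  # per-segment tag logits
print(tags.shape)  # torch.Size([5, 30])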
Type Of Submission: Full paper proceedings track submission (max 9 main pages).