Lexicon Graph Adapter Based BERT Model for Chinese Named Entity Recognition

Published: 01 Jan 2024, Last Modified: 15 May 2025 · KSEM (5) 2024 · CC BY-SA 4.0
Abstract: Recently, both lexicon information and pre-trained models have been shown to be effective for the Chinese named entity recognition (NER) task, and, given their complementary strengths, the two have been combined to improve performance. However, existing methods only incorporate the initial lexicon features into the characters: the lexicon features are not updated during model training, and the relationships between adjacent words are ignored. We propose a novel Lexicon Graph Adapter based BERT model for Chinese NER, namely LGA-BERT. The architecture constructs a heterogeneous graph that relates characters and words, and it updates and integrates the word features into the BERT layers through multiple Word-Adapter and Character-Adapter layers. We conduct extensive experiments on four widely used Chinese NER benchmark datasets. The results show that LGA-BERT achieves state-of-the-art performance compared with previous lexicon-based methods.
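The abstract does not give implementation details, so the following is only a minimal sketch of the general idea it describes: a Word-Adapter that updates matched-lexicon word-node features from the characters they relate to (so word features are no longer frozen during training), and a Character-Adapter that fuses the updated word features back into the character stream between BERT layers. All module and variable names here (WordAdapter, CharacterAdapter, word_feats, char_feats) are hypothetical, not the paper's.

```python
import torch
import torch.nn as nn

class WordAdapter(nn.Module):
    """Hypothetical sketch: word nodes attend to the character hidden
    states, so lexicon (word) features are updated during training."""
    def __init__(self, dim, heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, word_feats, char_feats):
        # Character -> word message passing; residual keeps word identity.
        updated, _ = self.attn(word_feats, char_feats, char_feats)
        return self.norm(word_feats + updated)

class CharacterAdapter(nn.Module):
    """Hypothetical sketch: characters attend to the updated word nodes,
    injecting lexicon knowledge into the stream fed to the next BERT layer."""
    def __init__(self, dim, heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, char_feats, word_feats):
        # Word -> character message passing with a residual connection.
        updated, _ = self.attn(char_feats, word_feats, word_feats)
        return self.norm(char_feats + updated)

# Toy usage between two BERT layers (shapes and dims are illustrative).
dim = 768
chars = torch.randn(2, 20, dim)  # character hidden states from a BERT layer
words = torch.randn(2, 7, dim)   # embeddings of lexicon words matched in the sentence

word_adapter, char_adapter = WordAdapter(dim), CharacterAdapter(dim)
words = word_adapter(words, chars)  # update word features (not frozen)
chars = char_adapter(chars, words)  # fuse lexicon features into characters
```

In a full model this pair of adapters would presumably be interleaved at several BERT layers, matching the abstract's description of "multiple layers" of Word-Adapter and Character-Adapter; the heterogeneous character-word graph would additionally restrict which words each character attends to, which this dense-attention sketch omits.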