Abstract: With the continuous development of deep learning, Named Entity Recognition (NER) has made great strides in recent years. In Chinese NER, making full use of word information has become key to improving model performance. Previous work incorporated word information through a lexicon, but the word vectors generated this way are static: they cannot accurately represent polysemous words in a specific context, which hurts performance on the NER task. This paper presents EiCi to address this problem. Rather than relying on external pre-trained word vectors, the proposed method leverages the pre-trained language model BERT to extract context-dependent information about polysemous words. To further exploit word information, a sub-module for type recognition is added to assist the main NER task. Experiments on two major Chinese NER datasets show that EiCi outperforms traditional NER models as well as other NER models that use word information.
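The abstract only sketches the approach at a high level, so the following is a minimal illustrative sketch, not the paper's actual EiCi implementation: it assumes a BERT encoder from the `transformers` library to produce contextual (non-static) token representations, a per-token tagging head for the main NER task, and an auxiliary entity-type classification head standing in for the type-recognition sub-module. The class and head names (`EiCiSketch`, `ner_head`, `type_head`) are hypothetical.

```python
import torch
import torch.nn as nn
from transformers import BertModel


class EiCiSketch(nn.Module):
    """Illustrative multi-task sketch: contextual BERT features + auxiliary type head."""

    def __init__(self, num_ner_labels: int, num_entity_types: int,
                 bert_name: str = "bert-base-chinese"):
        super().__init__()
        # Contextual representations from BERT, so polysemous words get
        # context-dependent vectors instead of static lexicon embeddings.
        self.encoder = BertModel.from_pretrained(bert_name)
        hidden = self.encoder.config.hidden_size
        # Main task: per-token NER tag classification.
        self.ner_head = nn.Linear(hidden, num_ner_labels)
        # Auxiliary sub-module (hypothetical stand-in): entity-type
        # recognition used to further exploit word information.
        self.type_head = nn.Linear(hidden, num_entity_types)

    def forward(self, input_ids: torch.Tensor, attention_mask: torch.Tensor):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        token_repr = out.last_hidden_state                 # (batch, seq_len, hidden)
        ner_logits = self.ner_head(token_repr)             # main NER task
        type_logits = self.type_head(out.pooler_output)    # auxiliary type task
        return ner_logits, type_logits
```

In a multi-task setup like this, the two heads would typically be trained jointly with a weighted sum of the NER tagging loss and the auxiliary type-classification loss; the actual loss design and architecture in EiCi may differ.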
Paper Type: long