Chinese Word Attention based on Valid Division of Sentence

Anonymous

16 Nov 2021 (modified: 05 May 2023) · ACL ARR 2021 November Blind Submission · Readers: Everyone
Abstract: Chinese word attention (CWA), which incorporates word-level information, is important for natural language processing. The goal is to attend to words within a sentence. We first obtain valid divisions of a sentence using word-segmentation tools, and pre-train character and word representations with BERT. Each character embedding, together with the word containing it in a given division, is encoded by block local attention. We then use attention with a prior to assign a weight to each segmentation result, and finally combine the results with a global attention mechanism to obtain the best recognition result on Chinese NER.
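The pipeline the abstract describes (block local attention within each word of a division, then a prior-weighted combination across divisions) can be sketched minimally as follows. This is an illustrative reconstruction, not the authors' implementation: the function names, the use of plain dot-product attention, and the softmax prior over divisions are all assumptions, and real word divisions would come from segmentation tools rather than being hand-specified.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def block_local_attention(char_emb, division):
    """Attend each character only to characters in its own word block.

    char_emb: (n_chars, d) character embeddings (e.g. from BERT).
    division: word lengths for one segmentation, e.g. [2, 1, 2] for 5 chars.
    """
    d = char_emb.shape[1]
    out = np.zeros_like(char_emb)
    start = 0
    for length in division:
        block = char_emb[start:start + length]            # (length, d)
        scores = block @ block.T / np.sqrt(d)             # scaled dot product
        out[start:start + length] = softmax(scores) @ block
        start += length
    return out

def combine_divisions(char_emb, divisions, prior_logits):
    """Weight each division's encoding by a (learned) prior and sum.

    divisions: list of segmentations covering the same sentence.
    prior_logits: one score per division (hypothetical; learned in practice).
    """
    prior = softmax(np.asarray(prior_logits, dtype=float))
    encoded = [block_local_attention(char_emb, div) for div in divisions]
    return sum(w * e for w, e in zip(prior, encoded))

# Toy example: a 5-character sentence with two candidate segmentations.
rng = np.random.default_rng(0)
emb = rng.standard_normal((5, 4))
fused = combine_divisions(emb, [[2, 3], [2, 1, 2]], prior_logits=[0.5, 1.5])
```

In a full model, a global attention layer over `fused` would then feed the NER tagger; here the sketch stops at the fused character representations.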