Abstract: Semantic role labeling (SRL) aims to identify the predicate-argument structure of a sentence. Recent work has significantly improved SRL performance by incorporating syntactic information and exploiting pre-trained models like BERT. Most such approaches use pre-trained models as isolated encoders to obtain word embeddings and enhance them with word-level syntax. Unlike models for many other languages, Chinese pre-trained models normally use Chinese characters rather than subwords as the basic input units, making the many-units-in-one-word phenomenon more frequent and the relationships between characters more important. However, previous research has often ignored this character-level information. In this paper, we propose the Character-Level Syntax-Infused network for Chinese SRL, which effectively incorporates the syntactic information between Chinese characters into pre-trained models. Experiments on the Chinese benchmarks of CoNLL-2009 and Universal Proposition Bank (UPB) show that the proposed approach achieves state-of-the-art results.
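To make the idea of "infusing character-level syntax into a pre-trained model" concrete, here is a minimal, hypothetical sketch of one way such a fusion layer could look. The abstract does not specify the architecture, so everything below (the `CharSyntaxFusion` module, the single graph-attention step, and the use of an arc mask projected from a word-level parse onto characters) is an illustrative assumption, not the paper's actual method:

```python
import torch
import torch.nn as nn

class CharSyntaxFusion(nn.Module):
    """Hypothetical sketch: fuse contextual character embeddings (e.g., from
    a Chinese BERT, whose input units are characters) with character-level
    syntactic arcs via one masked-attention step. Not the authors' design."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.query = nn.Linear(hidden_size, hidden_size)
        self.key = nn.Linear(hidden_size, hidden_size)
        self.value = nn.Linear(hidden_size, hidden_size)
        self.out = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, char_states: torch.Tensor, arc_mask: torch.Tensor) -> torch.Tensor:
        # char_states: (batch, num_chars, hidden) contextual character states
        # arc_mask:    (batch, num_chars, num_chars), 1 where a syntactic arc
        #              (projected from the word-level parse) links chars i and j
        scores = self.query(char_states) @ self.key(char_states).transpose(-1, -2)
        scores = scores / char_states.size(-1) ** 0.5
        # Attend only along syntactic arcs between characters.
        scores = scores.masked_fill(arc_mask == 0, float("-inf"))
        attn = torch.softmax(scores, dim=-1)
        attn = torch.nan_to_num(attn)  # characters with no arcs get zero context
        syntax_ctx = attn @ self.value(char_states)
        # Concatenate the original state with its syntax-aware context.
        return self.out(torch.cat([char_states, syntax_ctx], dim=-1))

# Toy usage: random tensors stand in for BERT character states and parsed arcs.
batch, n_chars, hidden = 2, 8, 16
states = torch.randn(batch, n_chars, hidden)
arcs = (torch.rand(batch, n_chars, n_chars) > 0.7).float()
fused = CharSyntaxFusion(hidden)(states, arcs)
print(fused.shape)  # torch.Size([2, 8, 16])
```

The masking step is the key point of the sketch: it restricts each character's attention to syntactically related characters, so the fused representation carries the character-to-character structure that purely word-level syntax enhancement would miss.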