Clause Attention based on Signal Words Division

Anonymous

16 Nov 2021 (modified: 05 May 2023) · ACL ARR 2021 November Blind Submission · Readers: Everyone
Abstract: Clause Attention (CA) is important for processing long sentences. We build and label datasets for signal-word training. Long sentences are divided into clauses at the positions of signal words, and each clause is assigned additional block attention. The original sentence is mapped and fed into a shared encoder to learn additional representations of the words within its clauses. We use attention with a prior to balance global attention against local attention, which improves the quality of long-sentence processing on NER and NMT tasks.
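The abstract's mechanism can be illustrated with a minimal sketch: a block mask restricts attention to positions within the same clause (as delimited by signal words), and a fixed prior weight blends clause-local attention with ordinary global attention. All names here (`clause_block_mask`, `blended_attention`, the scalar `prior`) are illustrative assumptions, not the paper's actual implementation; the paper's prior may be learned rather than fixed.

```python
import numpy as np

def clause_block_mask(clause_ids):
    """Boolean mask: True where query i and key j lie in the same clause.

    clause_ids[t] is the clause index of token t, e.g. obtained by
    splitting the sentence at signal-word positions (an assumption).
    """
    ids = np.asarray(clause_ids)
    return ids[:, None] == ids[None, :]

def _softmax(x, mask=None):
    # Masked positions get a large negative score before normalization.
    if mask is not None:
        x = np.where(mask, x, -1e9)
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def blended_attention(scores, clause_ids, prior=0.5):
    """Blend global attention with clause-local block attention.

    `prior` is a hypothetical fixed weight standing in for the paper's
    "attention with prior" that balances global and local attention.
    """
    global_attn = _softmax(scores)
    local_attn = _softmax(scores, clause_block_mask(clause_ids))
    return prior * global_attn + (1 - prior) * local_attn

# Usage: four tokens, two clauses; uniform scores for clarity.
clause_ids = [0, 0, 1, 1]
scores = np.zeros((4, 4))
attn = blended_attention(scores, clause_ids, prior=0.5)
```

With uniform scores, each row still sums to 1, and same-clause positions receive more mass than cross-clause ones because the local component is confined to the clause block.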