Incorporate Dependency Relation Knowledge into Transformer Block for Multi-turn Dialogue Generation

Anonymous

17 Dec 2021 (modified: 05 May 2023), ACL ARR 2021 December Blind Submission
Abstract: Because of the compositionality of natural language, syntactic structure is one of the key factors for semantic understanding. However, the Transformer block, which is widely used to obtain distributed representations of sentences in dialogue generation tasks, treats a sentence as a sequence of words and does not effectively learn its syntactic structure. In this work, we explore how to effectively incorporate dependency relation knowledge, which encodes syntactic structure information, into the Transformer block, and propose Dependency Relation Attention (DRA). Experimental results demonstrate that DRA further improves the performance of state-of-the-art models for multi-turn dialogue generation.
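The abstract does not describe DRA's actual mechanism, so the following is only a minimal, hypothetical sketch of one common way to inject dependency relation knowledge into a Transformer attention block: embedding the relation label between each token pair and adding it as a bias to the attention logits. All module and argument names below are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: self-attention with an additive dependency-relation bias.
# Assumes the dependency relation between each pair of tokens (i, j) is given as
# an integer label; this is NOT necessarily how DRA is implemented in the paper.
import math
import torch
import torch.nn as nn


class DependencyRelationAttention(nn.Module):
    """Single-head self-attention whose logits are biased by dependency relations."""

    def __init__(self, d_model: int, num_relations: int):
        super().__init__()
        self.d_model = d_model
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        # One learned scalar bias per dependency relation label (e.g. nsubj, obj, ...).
        self.rel_bias = nn.Embedding(num_relations, 1)

    def forward(self, x: torch.Tensor, rel_ids: torch.Tensor) -> torch.Tensor:
        # x:       (batch, seq_len, d_model) token representations
        # rel_ids: (batch, seq_len, seq_len) relation label between tokens i and j
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        scores = torch.matmul(q, k.transpose(-2, -1)) / math.sqrt(self.d_model)
        scores = scores + self.rel_bias(rel_ids).squeeze(-1)  # inject syntactic bias
        attn = torch.softmax(scores, dim=-1)
        return torch.matmul(attn, v)


if __name__ == "__main__":
    batch, seq_len, d_model, num_relations = 2, 5, 64, 40
    layer = DependencyRelationAttention(d_model, num_relations)
    tokens = torch.randn(batch, seq_len, d_model)
    relations = torch.randint(0, num_relations, (batch, seq_len, seq_len))
    print(layer(tokens, relations).shape)  # torch.Size([2, 5, 64])
```

In this sketch the relation bias plays the same role as relative-position biases in relation-aware attention variants; whether DRA uses a scalar bias, a full relation embedding added to keys/values, or another mechanism is not specified in the abstract.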
Paper Type: short
Consent To Share Data: yes