Incorporate Dependency Relation Knowledge into Transformer Block for Multi-turn Dialogue Generation

Anonymous

04 Mar 2022 (modified: 05 May 2023) · Submitted to NLP for ConvAI
Abstract: Because of the compositionality of natural language, syntactic structure is a key factor for semantic understanding in dialogue generation tasks. However, the widely adopted Transformer struggles to learn compositionality effectively, because its position embeddings carry little semantic relation information. To explicitly model the compositionality of language, we limit the information flow between words by constructing a directed dependency relation graph and propose Dependency Relation Attention (DRA) to replace position embeddings. Experimental results demonstrate that DRA can further improve the performance of state-of-the-art models for multi-turn dialogue generation.
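To make the idea concrete, below is a minimal sketch (not the paper's actual implementation) of how self-attention could be restricted by a directed dependency relation graph: attention scores between word pairs with no dependency relation are masked out before the softmax. The function name `dependency_masked_attention` and the adjacency tensor `dep_adj` are illustrative assumptions, not taken from the paper.

```python
import torch
import torch.nn.functional as F

def dependency_masked_attention(q, k, v, dep_adj):
    """Scaled dot-product attention restricted to a directed dependency graph.

    q, k, v:  (batch, seq_len, d) query/key/value projections
    dep_adj:  (batch, seq_len, seq_len) binary mask where dep_adj[b, i, j] = 1
              if token i is allowed to attend to token j in the directed
              dependency relation graph (self-loops included so every row
              has at least one allowed position).
    """
    d = q.size(-1)
    scores = torch.matmul(q, k.transpose(-2, -1)) / d ** 0.5
    # Block information flow between word pairs with no dependency relation.
    scores = scores.masked_fill(dep_adj == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)
    return torch.matmul(weights, v)
```

In this sketch the dependency graph takes over the role that position embeddings normally play: relative structure is injected through which attention links are permitted rather than through learned positional vectors.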