Modeling Compositionality with Dependency Graph for Dialogue Generation

Abstract: Because of the compositionality of natural language, syntactic structure, which encodes the relationships between words, is a key factor in semantic understanding. However, the widely adopted Transformer struggles to learn syntactic structure effectively in dialogue generation tasks. To explicitly model the compositionality of language in the Transformer block, we restrict the information flow between words by constructing a directed dependency graph and propose Dependency Relation Attention (DRA). Experimental results demonstrate that DRA further improves the performance of state-of-the-art models for dialogue generation.
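
As a rough illustration of the idea described in the abstract, the sketch below masks standard scaled dot-product attention with a directed dependency graph so that each token attends only to itself and its dependency neighbors. This is a minimal, assumption-laden reading, not the paper's actual DRA formulation: the function names, the masking scheme, and the toy parse are all hypothetical.

```python
import torch
import torch.nn.functional as F

def dependency_mask(edges, seq_len):
    """Build a boolean attention mask from directed dependency edges.

    edges: list of (head, dependent) index pairs from a dependency parse.
    Returns a (seq_len, seq_len) mask where True marks allowed attention.
    """
    mask = torch.eye(seq_len, dtype=torch.bool)  # each token always sees itself
    for head, dep in edges:
        mask[dep, head] = True  # a dependent may attend to its head
    return mask

def masked_attention(q, k, v, mask):
    """Scaled dot-product attention restricted to the dependency graph."""
    scores = q @ k.transpose(-2, -1) / k.size(-1) ** 0.5
    scores = scores.masked_fill(~mask, float("-inf"))  # block non-edges
    return F.softmax(scores, dim=-1) @ v

# Toy example: "she reads books", with "reads" (index 1) heading both
# "she" (index 0) and "books" (index 2).
seq_len, d = 3, 8
q = k = v = torch.randn(seq_len, d)
mask = dependency_mask([(1, 0), (1, 2)], seq_len)
out = masked_attention(q, k, v, mask)
print(out.shape)  # torch.Size([3, 8])
```

In this reading, the dependency graph acts as a hard attention mask inside the Transformer block; the actual DRA mechanism may instead bias or gate attention weights rather than masking them outright.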