Generic Dependency Modeling in Multi-Party Conversation

Anonymous

16 Nov 2021 (modified: 05 May 2023) · ACL ARR 2021 November Blind Submission
Abstract: Modeling the dependency between utterances in a multi-party conversation facilitates a more precise and holistic understanding of the conversation. In this paper, we propose a simple and generic framework for this purpose, in which the dependency is built on discourse parsing of utterances. In particular, we present two approaches to encoding the dependency, namely absolute dependency encoding and relative dependency encoding, and combine them in Transformers by modifying the computation of self-attention. To enhance the understanding of utterance dependency, we further introduce a span distance prediction pre-training task for the proposed model. Experimental results on four multi-party conversation benchmarks for different tasks show that this model consistently boosts the performance of Transformer-based language models. Systematic studies are conducted to investigate why utterance dependencies are essential for multi-party conversation tasks and how they are learned in a simple and effective framework.
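The abstract states that absolute and relative dependency encodings are combined in Transformers by modifying the self-attention computation, but it does not give the exact parameterization. Below is a minimal illustrative sketch, assuming the two encodings enter as additive biases on the attention scores: an absolute bias derived from each utterance's depth in the discourse-dependency tree, and a relative bias derived from the pairwise tree distance between utterances. The function name, the additive form, and the scalar weights `w_abs` and `w_rel` are all assumptions for illustration, not the paper's actual method.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def dependency_biased_attention(Q, K, V, dep_depth, rel_dist,
                                w_abs=0.1, w_rel=0.1):
    """Self-attention over utterance representations with two additive
    biases (hypothetical formulation for illustration):
      - dep_depth: (n,) depth of each utterance in the discourse tree
        (absolute dependency encoding)
      - rel_dist:  (n, n) tree distance between utterance pairs
        (relative dependency encoding)
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                 # standard scaled dot-product
    scores = scores + w_abs * dep_depth[None, :]  # absolute dependency bias
    scores = scores - w_rel * rel_dist            # penalize distant utterances
    return softmax(scores, axis=-1) @ V
```

In this sketch, utterances that are close in the discourse-dependency tree attend to each other more strongly, which matches the abstract's motivation that dependency structure should shape how utterances interact.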