Learning Dynamic Representations for Discourse Dependency Parsing

Published: 07 Oct 2023, Last Modified: 01 Dec 2023, EMNLP 2023 Findings
Submission Type: Regular Long Paper
Submission Track: Discourse and Pragmatics
Keywords: Discourse dependency parsing, Transition systems, Dynamic sub-tree representations, Graph attention networks
TL;DR: We employ graph models to derive dynamic representations for transition states based on the sub-tree structures we obtain from the previous steps.
Abstract: Transition systems have been widely used for the discourse dependency parsing task. Existing works often characterize transition states by examining a fixed number of elementary discourse units (EDUs), while neglecting the arcs obtained from the transition history. In this paper, we propose to employ a GAT-based encoder to learn dynamic representations for the sub-trees constructed in previous transition steps. By incorporating these representations, our model retains access to all parsed EDUs through the obtained arcs, thus better utilizing the structural information of the document, particularly when handling lengthy text spans with complex structures. For the discourse relation recognition task, we employ edge-featured GATs to derive better representations for EDU pairs. Experimental results show that our model achieves state-of-the-art performance on widely adopted datasets including RST-DT, SciDTB and CDTB. Our code is available at $\href{https://github.com/lty-lty/Discourse-Dependency-Parsing}{https://github.com/lty-lty/Discourse-Dependency-Parsing}$.
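To make the core idea concrete, the sketch below illustrates one simplified graph-attention aggregation pass over a partial dependency sub-tree: each head node attends over itself and its already-attached dependents, so the state representation reflects the arcs built in earlier transition steps. This is a minimal illustrative toy in plain Python (single attention head, dot-product scores, no learned projection matrices), not the authors' implementation; the function name and feature setup are assumptions for the example.

```python
import math

def gat_aggregate(features, arcs):
    """One simplified graph-attention pass over a partial sub-tree.

    features: list of per-EDU feature vectors (lists of floats).
    arcs: list of (head, dependent) index pairs built so far.
    Each node attends over itself plus its dependents; its updated
    representation is the attention-weighted sum of those features.
    """
    n = len(features)
    # neighbors[i] = i itself plus every dependent j attached to head i
    neighbors = {i: [i] for i in range(n)}
    for head, dep in arcs:
        neighbors[head].append(dep)

    out = []
    for i in range(n):
        # unnormalized attention score: dot product of node i with neighbor j
        scores = [sum(a * b for a, b in zip(features[i], features[j]))
                  for j in neighbors[i]]
        # numerically stable softmax over the neighborhood
        m = max(scores)
        weights = [math.exp(s - m) for s in scores]
        z = sum(weights)
        weights = [w / z for w in weights]
        # attention-weighted sum of neighbor features
        dim = len(features[i])
        out.append([sum(w * features[j][d]
                        for w, j in zip(weights, neighbors[i]))
                    for d in range(dim)])
    return out

# Toy example: three EDUs, one arc (EDU 0 heads EDU 1).
feats = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]
arcs = [(0, 1)]
updated = gat_aggregate(feats, arcs)
```

After the pass, EDU 0's representation mixes in its dependent's features, while unattached EDUs keep their own features (their neighborhood is just themselves), so the transition state dynamically reflects the current tree structure.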
Submission Number: 3205