Discourse beyond Units: The Role of Context in Relation Recognition

ACL ARR 2024 December Submission 2331 Authors

16 Dec 2024 (modified: 05 Feb 2025) · ACL ARR 2024 December Submission · CC BY 4.0
Abstract:

Discourse frameworks have traditionally centered on minimal spans, the "discourse units" or arguments defined by annotation schemas in frameworks such as PDTB or RST. Although discourse relations have never been treated in complete isolation, restricting models to these minimal spans may still be limiting, since annotators typically have access to the entire context when labeling spans and relations. In this study, we empirically evaluate the inclusion of contextual information in discourse modeling, and additionally evaluate the effect of explicitly modeling interactions between the spans. Our findings reveal that context-inclusive models outperform non-contextual baselines for explicit relations, with the inclusion of context proving more beneficial than explicit inter-argument modeling, but yield no benefit for implicit relations. We observe average improvements of 10.04% for PDTB3-L1 and 16.25% for L2. This work suggests that discourse units are not as minimal as previously assumed; it contributes to a more nuanced understanding of discourse structure and opens new avenues for improving NLP for discourse comprehension.

Paper Type: Short
Research Area: Discourse and Pragmatics
Research Area Keywords: discourse relations, PDTB, Penn Discourse Treebank, rhetorical structure theory, contextual models
Contribution Types: Model analysis & interpretability, Data analysis
Languages Studied: English
Submission Number: 2331