Squaring the Circle: More Generalizable Dialogue Discourse Parsing with Less Supervision

Anonymous

16 Oct 2023 · ACL ARR 2023 October Blind Submission · Readers: Everyone
Abstract: Discourse analysis plays a crucial role in Natural Language Processing, with discourse relation prediction arguably being the most difficult task in discourse parsing. Previous studies have generally focused on explicit or implicit discourse relation classification in monologues, leaving dialogue an under-explored domain. To address the data scarcity issue, we propose to leverage self-training strategies built on a Transformer backbone. Moreover, we design the first semi-supervised full discourse parsing pipeline that conducts the parsing tasks sequentially. Using only 50 examples as gold training data, our relation prediction module achieves 58.4 accuracy on the STAC corpus, close to the supervised state of the art. Full parsing results show notable improvements over supervised models both in-domain (gaming) and cross-domain (technical chat), with better stability.
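The abstract refers to self-training as the strategy for coping with scarce gold data. As a point of reference only, a minimal, hypothetical sketch of a generic self-training loop is given below; it is not the authors' implementation, and a TF-IDF + logistic-regression classifier stands in for the Transformer backbone. The confidence threshold, number of rounds, and all variable names are illustrative assumptions.

```python
# Minimal self-training sketch (illustrative, not the paper's code).
# A scikit-learn pipeline stands in for the Transformer relation classifier.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def self_train(gold_texts, gold_labels, unlabeled_texts,
               confidence_threshold=0.9, rounds=3):
    """Iteratively pseudo-label confident unlabeled examples and retrain."""
    texts, labels = list(gold_texts), list(gold_labels)
    pool = list(unlabeled_texts)
    model = make_pipeline(TfidfVectorizer(),
                          LogisticRegression(max_iter=1000))
    for _ in range(rounds):
        model.fit(texts, labels)                  # train on gold + pseudo-labels
        if not pool:
            break
        probs = model.predict_proba(pool)         # (n_pool, n_relations)
        best = probs.max(axis=1)
        preds = model.classes_[probs.argmax(axis=1)]
        keep = best >= confidence_threshold       # keep only confident predictions
        texts += [t for t, k in zip(pool, keep) if k]
        labels += list(preds[keep])
        pool = [t for t, k in zip(pool, keep) if not k]
    return model
```

The general pattern, training on a small gold set, pseudo-labeling the unlabeled pool, and retraining on the expanded data, is what "self-training" conventionally denotes; the paper's specific filtering and pipeline design are described in the full text.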
Paper Type: long
Research Area: Discourse and Pragmatics
Contribution Types: Model analysis & interpretability, Reproduction study, Approaches to low-resource settings
Languages Studied: English
Consent To Share Submission Details: On behalf of all authors, we agree to the terms above to share our submission details.