Improve Discourse Dependency Parsing with Contextualized Representations

Anonymous

08 Mar 2022 (modified: 05 May 2023) · NAACL 2022 Conference Blind Submission · Readers: Everyone
Paper Link: https://openreview.net/forum?id=p6Gx8159ll4
Paper Type: Long paper (up to eight pages of content + unlimited references and appendices)
Abstract: Previous work shows that discourse analysis benefits from modeling the intra- and inter-sentential levels separately, which calls for representations of text units at different granularities that capture both the content of each unit and its relation to the context. In this paper, we propose to use transformers to encode different contextualized representations for units at different levels, dynamically capturing the information required for discourse dependency analysis at both the intra- and inter-sentential levels. Motivated by the observation that writing patterns shared across articles can improve discourse analysis, we design sequence labeling methods that exploit such structural information from the context and substantially outperform traditional direct classification methods. Experiments show that our model achieves state-of-the-art results on both English and Chinese datasets.
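The abstract combines two ideas that a short sketch can make concrete: pooling transformer token states into representations of discourse units, and then labeling each unit with the relative position of its dependency head rather than scoring every unit pair independently. The code below is not the authors' released implementation; the class name, the mean-pooling of unit spans, the BiLSTM context layer, and the head-offset tagging scheme are all illustrative assumptions about one way such a sequence-labeling parser could be set up.

```python
import torch
import torch.nn as nn
from transformers import AutoModel

class SequenceLabelingParser(nn.Module):
    """Sketch: discourse dependency head prediction as sequence labeling."""

    def __init__(self, model_name="bert-base-uncased", max_offset=32):
        super().__init__()
        # Contextualized token representations from a pretrained transformer.
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        # A BiLSTM over unit vectors lets each unit's label depend on the
        # surrounding structure (the shared "writing patterns" in context).
        self.context = nn.LSTM(hidden, hidden // 2,
                               bidirectional=True, batch_first=True)
        # One tag per unit: the relative offset of its dependency head,
        # clamped to [-max_offset, max_offset] (a standard reduction of
        # head selection to sequence labeling).
        self.head_tagger = nn.Linear(hidden, 2 * max_offset + 1)

    def forward(self, input_ids, attention_mask, unit_spans):
        # (1, seq_len, hidden) token-level contextualized representations.
        tokens = self.encoder(input_ids=input_ids,
                              attention_mask=attention_mask).last_hidden_state
        # Pool each discourse unit's token span into a single vector.
        units = torch.stack([tokens[0, s:e].mean(dim=0)
                             for s, e in unit_spans]).unsqueeze(0)
        contextual, _ = self.context(units)
        # Logits over head-offset tags, one distribution per unit.
        return self.head_tagger(contextual)
```

Under the paper's intra-/inter-sentential split, the same pattern would presumably run twice: once over elementary discourse units within a sentence, and once over sentence-level units across the document, each with its own contextualized encoding.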
Presentation Mode: This paper will be presented virtually
Copyright Consent Signature (type Name Or NA If Not Transferrable): Yifei Zhou
Copyright Consent Name And Address: Yifei Zhou, Cornell University