Contrastive Learning for Dependency Parsing on Free Word Order and Morphologically Rich Low Resource Languages

Anonymous

16 Feb 2024 · ACL ARR 2024 February Blind Submission · Readers: Everyone
Abstract: Significant advances have been made in dependency parsing, with researchers introducing novel architectures to improve parsing performance. However, most of these architectures have been evaluated predominantly on languages with a fixed word order, such as English, and little attention has been paid to their robustness on relatively free word order languages. In this work, we examine the robustness of graph-based parsing architectures on four relatively free word order languages. We investigate essential modifications, such as data augmentation and the removal of position encoding, required to adapt these architectures accordingly. To this end, we propose a contrastive loss objective that makes the model robust to word order variations. Our proposed modification yields a substantial average gain of 3.48/3.10 points in Unlabelled/Labelled Attachment Score across the four languages compared to the best-performing modification.
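The abstract describes a contrastive objective that pulls together representations of a sentence and its word-order-permuted variant. The paper's actual encoder and loss are not given here, so the following is only a minimal sketch under assumptions: a toy position-sensitive encoder stands in for the parser's encoder, and an NT-Xent (InfoNCE-style) loss is assumed as the contrastive objective; the functions `encode` and `nt_xent` are illustrative names, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(token_ids, emb, max_len=16):
    # Toy position-sensitive encoder (stand-in for the paper's graph-based
    # parser encoder): token embeddings scaled by normalized position, then
    # mean-pooled and L2-normalized. Because positions enter the encoding,
    # permuting the word order changes the sentence vector.
    pos = np.arange(1, len(token_ids) + 1)[:, None] / max_len
    h = (emb[token_ids] * (1.0 + pos)).mean(axis=0)
    return h / np.linalg.norm(h)

def nt_xent(z1, z2, tau=0.1):
    # NT-Xent / InfoNCE over a batch: row i of z1 should be most similar
    # to row i of z2; other rows in the batch act as negatives.
    sim = z1 @ z2.T / tau                      # cosine similarities (unit-norm rows)
    sim -= sim.max(axis=1, keepdims=True)      # numerical stability
    logp = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(logp))

vocab, dim = 50, 8
emb = rng.normal(size=(vocab, dim))
batch = [rng.integers(0, vocab, size=6) for _ in range(4)]

# Two "views" of each sentence: original order and a random permutation,
# mimicking word-order data augmentation for free word order languages.
z_orig = np.stack([encode(s, emb) for s in batch])
z_perm = np.stack([encode(rng.permutation(s), emb) for s in batch])

loss = nt_xent(z_orig, z_perm)
print(float(loss))
```

Minimizing this loss encourages the encoder to assign similar representations to a sentence regardless of word order, which is the robustness property the abstract targets.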
Paper Type: short
Research Area: Syntax: Tagging, Chunking and Parsing / ML
Contribution Types: NLP engineering experiment, Approaches to low-resource settings
Languages Studied: Sanskrit, Telugu, Turkish, Gothic, English