Graph Neural Networks for Syntax Encoding in Cross-Lingual Semantic Role Labeling

Anonymous

16 Aug 2023 · ACL ARR 2023 August Blind Submission · Readers: Everyone
Abstract: Recent models in cross-lingual semantic role labeling (SRL) rarely consider whether their choice of encoder network is appropriate for the task. They rely on LSTMs as their encoders, even though LSTMs do not transfer effectively to distant languages. We evaluate the effectiveness of different graph neural networks (GNNs) enriched with universal dependency trees, i.e., transformer-based, graph convolutional network-based, and graph attention network (GAT)-based models, and compare them with a BiLSTM-based model. We investigate which dependency-aware GNNs transfer best as an alternative encoder to LSTMs in cross-lingual SRL. We focus our study on a zero-shot setting by training the models on English and evaluating them on 23 target languages in the Universal Proposition Bank. We consistently show that syntax from universal dependency trees is essential for cross-lingual SRL models to achieve better transferability. Dependency-aware self-attention with relative position representations (SAN-RPRs) transfers best across languages, especially at long dependency distances. Furthermore, our proposed dependency-aware two-attention relational GATs perform better than SAN-RPRs in languages where most arguments lie within a dependency distance of 1-2.
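To make the encoder comparison concrete, the following is a minimal, hypothetical sketch (not the authors' code) of dependency-aware self-attention with relative position representations, where the relative positions are clipped pairwise distances taken from a dependency tree rather than the raw sequence. It assumes PyTorch; all names (DependencyAwareSelfAttention, max_rel_dist, dep_dist) are illustrative assumptions, and the distances would normally be derived from universal dependency parses.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DependencyAwareSelfAttention(nn.Module):
    """Single-head self-attention with relative position representations
    (Shaw et al.-style), where relative offsets come from dependency distances."""

    def __init__(self, d_model: int, max_rel_dist: int = 4):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        self.max_rel_dist = max_rel_dist
        # One learned key/value embedding per clipped relative dependency distance.
        self.rel_k = nn.Embedding(2 * max_rel_dist + 1, d_model)
        self.rel_v = nn.Embedding(2 * max_rel_dist + 1, d_model)

    def forward(self, x: torch.Tensor, dep_dist: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model); dep_dist: (batch, seq, seq) signed distances
        # between token pairs along the dependency tree.
        q, k, v = self.q(x), self.k(x), self.v(x)
        rel = dep_dist.clamp(-self.max_rel_dist, self.max_rel_dist) + self.max_rel_dist
        rk, rv = self.rel_k(rel), self.rel_v(rel)          # (batch, seq, seq, d_model)
        # Content-based score plus relative-position score.
        scores = torch.matmul(q, k.transpose(-1, -2))
        scores = scores + torch.einsum("bid,bijd->bij", q, rk)
        attn = F.softmax(scores / x.size(-1) ** 0.5, dim=-1)
        # Weighted sum of values plus relative-position values.
        out = torch.matmul(attn, v) + torch.einsum("bij,bijd->bid", attn, rv)
        return out

# Tiny usage example with random inputs (real dependency distances would come
# from a universal dependency parser).
if __name__ == "__main__":
    layer = DependencyAwareSelfAttention(d_model=16)
    x = torch.randn(2, 5, 16)
    dist = torch.randint(-3, 4, (2, 5, 5))
    print(layer(x, dist).shape)  # torch.Size([2, 5, 16])
```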
Paper Type: long
Research Area: Semantics: Sentence-level Semantics, Textual Inference and Other areas