GATology for Linguistics: Syntactic Dependencies and Complementarity

Anonymous

03 Sept 2022 (modified: 05 May 2023), ACL ARR 2022 September Blind Submission
Abstract: The Graph Attention Network (GAT) is a graph neural network that can process and represent different types of linguistic information through a graph structure. Although GAT and syntactic knowledge are widely used to improve performance on downstream tasks, there is still little discussion of which syntactic knowledge GAT learns well compared to other neural networks. We therefore investigate the robustness of GAT for syntactic dependency prediction in three different languages with respect to the number of attention heads and the number of model layers. We obtain the best results when the number of attention heads increases and the number of layers is two. We also use a paired t-test and the F1-score to compare the syntactic dependency predictions of GAT with those of the pre-trained model BERT fine-tuned on a Machine Translation (MT) task. We analyze the differences between their predictions, which can yield syntactic complementarity and suggest that the two models could work together on downstream tasks. We find that GAT is competitive in syntactic dependency prediction and shows good syntactic complementarity with MT-fine-tuned BERT in most cases, whereas BERT fine-tuned specifically for the dependency prediction task outperforms GAT.
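To make the setup concrete, the following is a minimal sketch, assuming PyTorch Geometric, scikit-learn, and SciPy: a two-layer GAT (the depth the abstract reports as optimal) in which the number of attention heads is the main hyperparameter to vary, plus the per-sentence F1 and paired t-test comparison described above. The names TwoLayerGAT, per_sentence_f1, and compare_predictions are illustrative and do not come from the paper.

    import torch
    import torch.nn.functional as F
    from scipy.stats import ttest_rel
    from sklearn.metrics import f1_score
    from torch_geometric.nn import GATConv

    class TwoLayerGAT(torch.nn.Module):
        # Two GAT layers, matching the depth the abstract reports as optimal;
        # the number of attention heads is the main hyperparameter to vary.
        def __init__(self, in_dim, hidden_dim, num_labels, heads=8):
            super().__init__()
            # Layer 1: multi-head attention, head outputs concatenated.
            self.conv1 = GATConv(in_dim, hidden_dim, heads=heads, concat=True)
            # Layer 2: heads averaged to give one score per dependency label.
            self.conv2 = GATConv(hidden_dim * heads, num_labels,
                                 heads=heads, concat=False)

        def forward(self, x, edge_index):
            # x: token (node) features; edge_index: candidate dependency arcs.
            h = F.elu(self.conv1(x, edge_index))
            return self.conv2(h, edge_index)

    def per_sentence_f1(gold, pred):
        # Micro-averaged F1 of predicted dependency labels, one score per sentence.
        return [f1_score(g, p, average="micro") for g, p in zip(gold, pred)]

    def compare_predictions(f1_gat, f1_bert):
        # Paired t-test over the two models' per-sentence F1 scores.
        return ttest_rel(f1_gat, f1_bert)

Under this sketch, labeling syntactic dependencies reduces to node classification over a sentence graph; the paper's exact task formulation and input features may differ.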
Paper Type: long