A Semi-Autoregressive Graph Generative Model for Dependency Parsing

Anonymous

16 Jul 2022 (modified: 05 May 2023), ACL ARR 2022 July Blind Submission
Abstract: Recent years have witnessed impressive progress in neural dependency parsing. Depending on how the joint probability of the graph is factorized, existing parsers can be roughly divided into autoregressive and non-autoregressive ones. The former factorize the graph into sequentially dependent components and build it up component by component; the latter assume these components to be independent and output them all at once. However, when treating the directed edges of a dependency graph as explicit dependencies, we observe that the graph contains a mixture of independent and interdependent components, so neither pattern precisely captures the explicit dependencies among nodes and edges. Based on this property, we design a Semi-Autoregressive Dependency Parser that generates dependency graphs by adding node groups and edge groups autoregressively while emitting all elements within a group in parallel. The model also addresses two problems in graph generation, the uncertainty of generation orders and edge sparsity, by introducing the novel concept of a Topological Hierarchy and using a Graph Transformer as the decoder. Experiments show that the proposed parser outperforms strong baselines on the Enhanced Universal Dependencies of $14$ languages, and the performance of model variants confirms the importance of specific components.
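To make the semi-autoregressive generation scheme concrete, the following is a minimal Python sketch, not the authors' implementation: it groups the tokens of a dependency tree by their depth from the root (a simple stand-in for the paper's Topological Hierarchy) and then walks the graph group by group, where a real parser would let a Graph Transformer decoder predict all nodes and incoming edges of a group in parallel. All names here (`topological_hierarchy`, `generate_semi_autoregressively`, `heads`) are illustrative assumptions.

```python
# Sketch of semi-autoregressive graph generation by topological hierarchy:
# autoregressive across depth groups, parallel within each group.

from collections import defaultdict
from typing import Dict, List


def topological_hierarchy(heads: Dict[int, int]) -> List[List[int]]:
    """Group token indices by their distance from the root (head == 0)."""
    depth: Dict[int, int] = {}

    def get_depth(tok: int) -> int:
        if tok not in depth:
            head = heads[tok]
            depth[tok] = 0 if head == 0 else get_depth(head) + 1
        return depth[tok]

    groups = defaultdict(list)
    for tok in heads:
        groups[get_depth(tok)].append(tok)
    return [groups[d] for d in sorted(groups)]


def generate_semi_autoregressively(heads: Dict[int, int]) -> None:
    """Simulate generation: one autoregressive step per hierarchy level,
    with every node and incoming edge of that level emitted in the same step."""
    generated: List[int] = []
    for step, group in enumerate(topological_hierarchy(heads)):
        # In the actual model, a decoder would score all nodes and edges of
        # this group in parallel, conditioned on everything in `generated`.
        edges = [(heads[tok], tok) for tok in group]
        print(f"step {step}: add nodes {group} and edges {edges}")
        generated.extend(group)


# Toy example: heads[i] is the head of token i (0 denotes the root).
generate_semi_autoregressively({1: 2, 2: 0, 3: 2, 4: 5, 5: 3})
```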
Paper Type: long