Dependency Parsing with the Structuralized Prompt Template

ACL ARR 2024 June Submission4067 Authors

16 Jun 2024 (modified: 02 Aug 2024) · ACL ARR 2024 June Submission · CC BY 4.0
Abstract: Dependency parsing is a crucial task in natural language processing that involves identifying syntactic dependencies to construct the structural tree of a sentence. Traditional models perform dependency parsing by constructing embeddings and adding task-specific layers for prediction. We propose a novel method that performs dependency parsing using only a pre-trained encoder model with a text-to-text training approach. To facilitate this, we define a structuralized prompt template that effectively captures the structural information of the dependency tree. Our experimental results demonstrate that the proposed method achieves outstanding performance compared with traditional models, despite relying solely on an encoder model. Moreover, the method can be easily adapted to various encoder models suited to different target languages or training environments, and special features can be easily incorporated into the encoder models.
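The abstract describes casting dependency parsing as a text-to-text task, where the dependency tree is serialized into a flat target string. The paper's actual template format is not given on this page; the sketch below is a hypothetical illustration of one way such a linearization could work, pairing each token with its head index (0 for the root) and its relation label. The function names and template syntax are assumptions, not the authors' method.

```python
def linearize_dependency_tree(tokens, heads, deprels):
    """Serialize a dependency tree into a flat text target.

    Each token is rendered as token(head,relation), where head is the
    1-based index of its governor (0 = root). NOTE: this template is a
    hypothetical illustration, not the paper's structuralized prompt
    template; it also assumes tokens contain no '(' or ',' characters.
    """
    return " ".join(
        f"{tok}({head},{rel})"
        for tok, head, rel in zip(tokens, heads, deprels)
    )


def delinearize(parse_string):
    """Recover (token, head, relation) triples from the flat string."""
    triples = []
    for item in parse_string.split():
        tok, rest = item.split("(", 1)
        head, rel = rest.rstrip(")").split(",", 1)
        triples.append((tok, int(head), rel))
    return triples
```

A round trip on a toy sentence ("The dog barks", with "barks" as root) shows that the serialization is lossless, which is what lets a single encoder trained text-to-text stand in for dedicated parsing layers.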
Paper Type: Short
Research Area: Syntax: Tagging, Chunking and Parsing
Research Area Keywords: dependency parsing
Contribution Types: NLP engineering experiment, Publicly available software and/or pre-trained models
Languages Studied: English, Korean, Bulgarian, Catalan, Czech, German, Spanish, French, Italian, Dutch, Norwegian, Romanian, Russian
Submission Number: 4067