SynPrompt: Syntax-aware Enhanced Prompt Engineering for Aspect-based Sentiment Analysis

Published: 01 Jan 2024, Last Modified: 20 Apr 2025 · LREC/COLING 2024 · CC BY-SA 4.0
Abstract: Although prompt learning has been applied to Aspect-based Sentiment Analysis (ABSA), existing prompt-tuning methods remain simplistic. Compared with vanilla fine-tuning, prompt learning intuitively bridges the gap in objective form between pre-training and fine-tuning. However, simply constructing prompts around aspect words fails to fully exploit the potential of Pre-trained Language Models (PLMs), and designing more robust, task-specific prompt engineering for downstream tasks remains an urgent open problem. Therefore, in this paper, we propose a novel Syntax-aware Enhanced Prompt method (SynPrompt), which mines the key syntactic information related to aspect words from the syntactic dependency tree. Additionally, to effectively harness the domain-specific knowledge embedded within PLMs for ABSA, we construct two adaptive prompt frameworks that enhance the perception ability of the above method. Extensive experiments on three benchmark datasets show that our method consistently achieves favorable results. These findings not only demonstrate the effectiveness and rationality of our proposed methods but also provide a powerful alternative to traditional prompt-tuning.
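The core idea the abstract describes — extracting aspect-related context from a syntactic dependency tree and folding it into a cloze-style prompt — can be sketched roughly as follows. This is a minimal illustration with a hand-built parse; the function names (`syntactic_neighbors`, `build_syntax_prompt`), the template wording, and the one-hop neighborhood choice are assumptions for illustration, not the paper's actual implementation.

```python
# Hypothetical sketch of syntax-aware prompt construction for ABSA.
# A dependency parse is given as per-token head indices and relations;
# in practice these would come from a dependency parser.
tokens = ["The", "battery", "life", "is", "amazing", "but", "heavy"]
heads = [2, 2, 4, 4, -1, 4, 4]  # index of each token's head (-1 = root)
rels = ["det", "compound", "nsubj", "cop", "root", "cc", "conj"]

def syntactic_neighbors(aspect_idx, hops=1):
    """Collect tokens within `hops` steps of the aspect word in the tree."""
    keep = {aspect_idx}
    for _ in range(hops):
        new = set()
        for i, h in enumerate(heads):
            if i in keep and h >= 0:
                new.add(h)  # walk up to the head
            if h in keep:
                new.add(i)  # walk down to dependents
        keep |= new
    return sorted(keep)

def build_syntax_prompt(aspect_idx):
    """Fill a cloze-style template with the aspect and its syntactic context."""
    context = " ".join(tokens[i] for i in syntactic_neighbors(aspect_idx))
    aspect = tokens[aspect_idx]
    return f"Given the context '{context}', the sentiment of {aspect} is [MASK]."

# Aspect word "life": its 1-hop syntactic neighborhood pulls in the
# opinion word "amazing" (its head) while dropping unrelated tokens.
print(build_syntax_prompt(2))
```

A masked language model would then score sentiment-label verbalizers (e.g. "good"/"bad") at the `[MASK]` position; the syntactic filtering keeps the prompt focused on tokens structurally tied to the aspect rather than the whole sentence.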
