Enhancing Grid Tagging Scheme with Hybrid Contrastive Learning for Aspect Sentiment Triplet Extraction

Guangmin Zheng, Jin Wang, Liang-Chih Yu, Xuejie Zhang

Published: 01 Jan 2025, Last Modified: 15 Mar 2026 · IEEE Transactions on Computational Social Systems · CC BY-SA 4.0
Abstract: Aspect sentiment triplet extraction (ASTE) jointly extracts aspect terms, opinion terms, and their associated sentiment polarities. The main challenge is accurately matching aspect–opinion pairs and modeling their sentiment relationships. Recent approaches have attempted to model syntactic dependencies by stacking complex graph neural networks (GNNs) on top of transformers. However, these approaches often introduce noisy connections, incur substantial parameter overhead, and are prone to overfitting under limited supervision. Moreover, they tend to overlook improving the sentiment representation itself. To address these issues, this study proposes an improved grid tagging scheme with hybrid contrastive learning (GTS-HCL), which introduces minimal additional parameters and leverages self-supervised signals from contrastive learning (CL). Specifically, sentiment-aware contrastive learning (SCL) introduces word-level continuous sentiment information in the valence–arousal (VA) space. Syntactic-aware graph contrastive learning (SGCL) guides the language model in capturing syntactic dependencies. Supervised tagging contrastive learning (STCL) further guides the language model in distinguishing positive from negative samples. Experimental results on the ASTE-Data-V2 dataset show that GTS-HCL significantly outperforms previous state-of-the-art models.
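The supervised tagging contrastive learning (STCL) component relies on the standard supervised contrastive objective: representations sharing a label are pulled together while differently labeled ones are pushed apart. The sketch below illustrates that generic loss in NumPy; it is an assumption-laden illustration of the technique, not the authors' implementation, and the paper's actual formulation (temperature, masking, tag granularity) may differ.

```python
import numpy as np

def supervised_contrastive_loss(embeddings, labels, temperature=0.1):
    """Generic supervised contrastive (InfoNCE-style) loss.

    Each anchor's same-label samples act as positives and all other
    samples as negatives; the loss is the negative mean log-probability
    assigned to the positives under a softmax over similarities.
    This is an illustrative sketch, not the paper's exact STCL loss.
    """
    # L2-normalize so dot products are cosine similarities
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / temperature                 # pairwise scaled similarities
    n = len(labels)
    self_mask = ~np.eye(n, dtype=bool)          # exclude each anchor itself
    pos_mask = (labels[:, None] == labels[None, :]) & self_mask

    # log-softmax over all non-self pairs for each anchor
    exp_sim = np.exp(sim) * self_mask
    log_prob = sim - np.log(exp_sim.sum(axis=1, keepdims=True))

    # average log-probability over positives, for anchors with >= 1 positive
    pos_counts = pos_mask.sum(axis=1)
    valid = pos_counts > 0
    loss = -(log_prob * pos_mask).sum(axis=1)[valid] / pos_counts[valid]
    return loss.mean()
```

Well-clustered embeddings (same-label points close together) yield a loss near zero, while randomly scattered embeddings yield a larger loss, which is the training signal the CL components exploit.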