Abstract: Aspect-level sentiment classification (ALSC) aims to predict the sentiment polarity of specific aspects in the input text. In recent years, given the advantages of graph neural networks (GNNs) in capturing structural information, an increasing number of studies have integrated them with dependency trees for ALSC, achieving state-of-the-art results. However, existing methods do not fully exploit the information contained in dependency trees. In this paper, we propose a syntax-based residual graph attention network (SRGAT) that simultaneously uses three types of syntactic information (dependency relations, dependency distances, and part-of-speech tags) to capture aspect-related sentiment features. Comparative experiments on four benchmark datasets show that the proposed model outperforms a range of state-of-the-art models. In addition, ablation studies demonstrate the effectiveness of each of the three types of syntactic information for ALSC.