MambaForDIF: Distance-Importance Features and Long-Range Dependencies for Enhancing Aspect-Based Sentiment Analysis
Abstract: In Aspect-Based Sentiment Analysis (ABSA), exploiting syntactic structures derived from dependency parsing with graph neural networks (GNNs) has proven effective. However, most existing studies model dependency relationships only through graph topology or attention coefficients, which limits how fully syntactic information is utilized. Moreover, the computational complexity of attention mechanisms restricts a network's ability to measure long-range dependencies, confining model effectiveness to short-range relationships. To overcome these limitations, a novel approach, termed MambaDIF, is proposed that strengthens long-range dependency modeling through an improved distance-based dependency importance calculation and a MambaFormer module. Specifically, MambaDIF introduces a refined distance importance function to better model syntactic dependencies, and it incorporates the MambaFormer module, which encodes inputs carrying both dependency and semantic information. Within the MambaFormer module, Multi-Head Attention (MHA) and Mamba blocks work in tandem to capture short-term and long-term dependencies simultaneously. In addition, syntactic and semantic Graph Convolutional Networks (GCNs) are combined with triple learning and orthogonal projection techniques to extract multi-level information. Extensive experiments on three benchmark datasets demonstrate that MambaDIF consistently outperforms baseline models.
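To make the MHA-plus-Mamba pairing concrete, the following is a minimal PyTorch sketch of a MambaFormer-style block. It assumes a parallel arrangement of the two branches with simple additive residual fusion; the abstract does not specify the fusion scheme, so the class name `MambaFormerBlock`, the branch layout, and the fusion are all illustrative assumptions, not the paper's definitive design. It uses the `Mamba` module from the public `mamba-ssm` package (whose CUDA kernels require a GPU) rather than the authors' implementation.

```python
# Hypothetical sketch of a MambaFormer-style block: an MHA branch
# for short-range dependencies runs alongside a Mamba (selective
# state-space) branch for long-range dependencies, and the two
# outputs are fused with a residual sum. The parallel layout and
# additive fusion are assumptions, not taken from the paper.
# Requires: pip install mamba-ssm (CUDA GPU needed for its kernels).
import torch
import torch.nn as nn
from mamba_ssm import Mamba


class MambaFormerBlock(nn.Module):
    def __init__(self, d_model: int, n_heads: int = 8):
        super().__init__()
        self.norm = nn.LayerNorm(d_model)
        # Short-range branch: standard multi-head self-attention.
        self.mha = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Long-range branch: a Mamba selective state-space block.
        self.mamba = Mamba(d_model=d_model, d_state=16, d_conv=4, expand=2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) token representations that may
        # already carry dependency and semantic information.
        h = self.norm(x)
        attn_out, _ = self.mha(h, h, h)   # short-term dependencies
        mamba_out = self.mamba(h)         # long-term dependencies
        return x + attn_out + mamba_out   # residual fusion (assumed)


# Usage: encode a batch of 128-token sequences of 768-d embeddings.
block = MambaFormerBlock(d_model=768).to("cuda")
tokens = torch.randn(2, 128, 768, device="cuda")
out = block(tokens)  # shape: (2, 128, 768)
```

Keeping the attention and Mamba branches in parallel (rather than stacking them) lets each token position draw on both a content-based short-range view and a recurrent long-range view of the sequence in a single layer, which matches the abstract's claim that the two components capture short- and long-term dependencies simultaneously.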