A Dual-Branch Super-Deep MambaPlusResGCN for Node Classification: Achieving Robustness Against Over-Smoothing
Abstract: Graph neural networks (GNNs) have become the standard paradigm for learning on non-Euclidean, graph-structured data. However, the performance of modern GNN models deteriorates rapidly as depth increases, due to the well-known over-smoothing effect, whereby node representations converge to nearly identical values. Prior methods tackle this problem with neighbor-sampling schemes, attention-based aggregation, or, most recently, selective state-space layers. In this work, we introduce a dual-branch architecture, MambaPlusResGCN, which combines the Mamba state-space backbone with a residual GraphSAGE branch, thereby preserving both global and local information flow in arbitrarily deep networks. Whereas vanilla graph convolutional layers behave as fixed low-pass filters, our model adaptively modulates signal frequencies while explicitly guarding against representational collapse. Experiments on seven node-classification benchmarks show that the proposed method maintains competitive accuracy at increasing depths and outperforms GCN, GraphSAGE, and recent deep-GNN baselines. Source code is available at https://github.com/EchoAnxue/HW_UKCI35.git.
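To make the over-smoothing contrast concrete, the following is a minimal toy sketch (not the authors' MambaPlusResGCN implementation; the layer names, graph, and mixing coefficient `alpha` are illustrative assumptions). It compares plain mean-aggregation propagation, which acts as a fixed low-pass filter, with a residual GraphSAGE-style variant that retains a fraction of each node's own representation, and shows that the residual branch preserves more variance across nodes in a deep stack.

```python
import numpy as np

def mean_aggregate(H, A):
    """Average each node's neighbor features (row-normalized adjacency)."""
    deg = A.sum(axis=1, keepdims=True)
    return (A @ H) / np.maximum(deg, 1.0)

def plain_layer(H, A):
    # Vanilla propagation: a fixed low-pass filter over the graph signal.
    return mean_aggregate(H, A)

def residual_layer(H, A, alpha=0.5):
    # Residual branch (hypothetical coefficient alpha) keeps part of the
    # input, guarding against representational collapse at depth.
    return alpha * H + (1.0 - alpha) * mean_aggregate(H, A)

# Toy 4-node path graph with self-loops, one-hot node features.
A = np.array([[1, 1, 0, 0],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 1]], dtype=float)
H_plain = np.eye(4)
H_res = np.eye(4)

for _ in range(32):  # a 32-layer stack
    H_plain = plain_layer(H_plain, A)
    H_res = residual_layer(H_res, A)

# Spread of node embeddings: plain propagation drives all rows toward a
# common vector, while the residual variant keeps nodes distinguishable.
def spread(X):
    return X.std(axis=0).sum()

print(spread(H_plain), spread(H_res))
```

The residual update corresponds to propagating with the matrix αI + (1−α)P instead of P, whose nontrivial eigenvalues decay more slowly, so node embeddings stay separated for many more layers.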
DOI: 10.1007/978-3-032-07938-1_36