GraphFusion: Adaptive Label Enhancement with Heterophily-Aware Attention and Dynamic Residual Calibration

18 Sept 2025 (modified: 25 Sept 2025) · ICLR 2026 Conference Withdrawn Submission · CC BY 4.0
Keywords: graph neural networks, semi-supervised learning, label propagation, heterophily-aware attention, contrastive learning, dynamic residual calibration, graph transformers
TL;DR: A scalable graph learning framework combining contrastive pretraining, heterophily-aware attention, and dynamic label smoothing for robust node classification.
Abstract: We propose GraphFusion, a unified and scalable framework for graph-based semi-supervised learning that integrates high-order feature augmentation, contrastive self-supervised pretraining, and transformer-style attention. GraphFusion enhances node representations using multi-hop SIGN embeddings and contrastive alignment, followed by a lightweight Graphormer module with heterophily-aware attention. To improve label propagation, we introduce a dynamic residual calibration mechanism that adaptively adjusts smoothing strength per node based on model confidence. Our framework supports multi-label classification and ensemble fusion across multiple pathways. Extensive experiments on ogbn-arxiv, ogbn-products, and ogbn-proteins demonstrate that GraphFusion outperforms strong baselines in both homophilous and heterophilous settings, while maintaining high efficiency and interpretability.
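The abstract only names the dynamic residual calibration mechanism; as an illustration, a minimal sketch of the idea, confidence-scaled label propagation on top of SIGN-style multi-hop features, could look like the following. This is an assumption-laden sketch, not the paper's code: the function names (sign_features, dynamic_residual_calibration) and the hyperparameters (num_hops, num_iters, base_alpha) are hypothetical, and GraphFusion's exact calibration rule may differ.

```python
import torch


def sign_features(x, adj_norm, num_hops=3):
    """SIGN-style multi-hop features: concatenate [X, AX, A^2 X, ...].

    x:        [N, F] node feature matrix
    adj_norm: [N, N] normalized adjacency as a sparse tensor
    """
    feats = [x]
    for _ in range(num_hops):
        feats.append(torch.sparse.mm(adj_norm, feats[-1]))
    return torch.cat(feats, dim=-1)  # [N, F * (num_hops + 1)]


def dynamic_residual_calibration(logits, adj_norm, train_mask, y_onehot,
                                 num_iters=10, base_alpha=0.9):
    """Hypothetical sketch of confidence-scaled label propagation
    (not the authors' implementation).

    Each node's smoothing strength shrinks with its prediction
    confidence: confident nodes keep their own prediction, while
    uncertain nodes lean on their propagated neighborhood.
    """
    probs = torch.softmax(logits, dim=-1)            # [N, C]
    conf = probs.max(dim=-1).values                  # per-node confidence
    # Per-node smoothing strength: high confidence -> small alpha.
    alpha = base_alpha * (1.0 - conf).unsqueeze(-1)  # [N, 1]

    z = probs.clone()
    z[train_mask] = y_onehot[train_mask]             # clamp labeled nodes
    for _ in range(num_iters):
        z = (1.0 - alpha) * probs + alpha * torch.sparse.mm(adj_norm, z)
        z[train_mask] = y_onehot[train_mask]
    return z
```

The sketch only illustrates the confidence-driven smoothing idea described in the abstract; in GraphFusion itself this step presumably sits downstream of the contrastive pretraining and the heterophily-aware Graphormer module.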
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: true
Submission Guidelines: true
Anonymous Url: true
No Acknowledgement Section: true
Submission Number: 12902