Kernel Complexity Reduced Graph Contrastive Learning for Noisy Node Classification

ICLR 2026 Conference Submission 20456 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Kernel Complexity Reduced Graph Contrastive Learning, Generalization Bound, Noisy Node Classification
TL;DR: We propose Kernel Complexity Reduced Graph Contrastive Learning (KCR-GCL), a principled GCL framework for noisy node classification with kernel-complexity-reduced self-attention and a provable generalization guarantee.
Abstract: Graph Neural Networks (GNNs) have achieved remarkable success in learning node representations and have demonstrated strong performance on node classification. However, their effectiveness can be substantially compromised by noise in real-world graph data. To address this challenge, we propose Kernel Complexity Reduced Graph Contrastive Learning (KCR-GCL), a principled framework for noisy node classification with a provable transductive generalization guarantee. KCR-GCL introduces a novel KCR-GCL encoder, which incorporates a new KCR self-attention layer that, inspired by generalized graph convolution, adaptively balances different frequency components of the graph and reduces the kernel complexity for provably improved generalization in transductive learning. The KCR-GCL encoder is optimized with a low-rank regularization term, the truncated nuclear norm (TNN) of the Gram matrix of the learned features. The learned low-rank representations are then used to train a linear classifier for transductive node classification on noisy graph data. The design of KCR-GCL is inspired by the Low Frequency Property (LFP) widely studied in general deep learning and node-level graph learning, and is further supported by a sharp generalization bound for transductive learning. To the best of our knowledge, KCR-GCL is among the first to theoretically reveal the benefits of low-rank regularization in transductive settings for noisy graph data. Experiments on standard benchmarks highlight the effectiveness and robustness of KCR-GCL in learning node representations under noisy conditions. The code of KCR-GCL is available at \url{https://anonymous.4open.science/status/KCR-GCL}.
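The truncated nuclear norm (TNN) regularizer mentioned in the abstract can be illustrated with a minimal sketch. The standard definition of the TNN of a matrix with truncation rank r is the sum of its singular values beyond the largest r, so minimizing it pushes the Gram matrix of the learned features toward rank r. The function and variable names below are hypothetical, not from the paper's code:

```python
import numpy as np

def truncated_nuclear_norm(G: np.ndarray, r: int) -> float:
    """Sum of the singular values of G beyond the largest r.

    Zero iff rank(G) <= r; minimizing it encourages a
    low-rank Gram matrix of the learned node features.
    """
    sigma = np.linalg.svd(G, compute_uv=False)
    return float(sigma[r:].sum())

# Z holds hypothetical learned features for 6 nodes in 4 dimensions.
rng = np.random.default_rng(0)
Z = rng.standard_normal((6, 4))
G = Z @ Z.T                       # Gram matrix of the features
penalty = truncated_nuclear_norm(G, r=2)
```

In training, such a penalty would be added to the contrastive loss; since G = ZZ^T has the same rank as Z, driving the tail singular values to zero yields the low-rank representations the abstract describes.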
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Submission Number: 20456