SSD: Self-Supervised Distillation for Heterophilic Graph Representation Learning

Published: 2026 · Last Modified: 21 Jan 2026 · IEEE Trans. Knowl. Data Eng. 2026 · CC BY-SA 4.0
Abstract: Graph Knowledge Distillation (GKD) has made remarkable progress in graph representation learning in recent years. Despite this success, GKD typically follows a label-dependent paradigm that relies heavily on large amounts of labeled data. Moreover, we observe that GKD suffers from embedding collapse, since merely maximizing the consistency between the teacher and the student is insufficient for heterophilic graphs. To tackle these challenges, we propose a Self-Supervised Distillation framework named SSD. To achieve label independence, the framework is built on contrastive learning. Specifically, we design a Topology Invariance Block (TIB) and a Feature Invariance Block (FIB) to distill semantic invariance from unlabeled data. Each block contains a teacher-student architecture trained with a projection-based contrastive loss. To avoid embedding collapse, the loss addresses two critical aspects: (1) maximizing consistency between representations of the same node produced by the teacher and the student (positive pairs); and (2) minimizing consistency between negative pairs, which comprise final teacher-student representation pairs and hidden teacher representation pairs. Guided by self-distillation within each block, TIB captures topology invariance while FIB learns feature invariance. Additionally, cross-distillation is applied between the two blocks, allowing each to gain additional contrastive knowledge from the other, which yields improved feature representations and enhanced classification performance. Comprehensive experiments on 10 datasets demonstrate that our model achieves superior performance on the node classification task. In summary, SSD offers a novel paradigm for self-supervised knowledge distillation on graph-structured data.
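To make the pairing scheme concrete, below is a minimal PyTorch sketch of an InfoNCE-style projection-based contrastive loss consistent with the description above. The function name, tensor shapes, temperature, and the exact combination of negative terms are assumptions for illustration, not the paper's implementation.

```python
import torch
import torch.nn.functional as F

def ssd_contrastive_loss(z_teacher, z_student, h_teacher, tau=0.5):
    """Hypothetical sketch of a projection-based contrastive loss in the
    spirit of SSD's description: maximize consistency between teacher and
    student representations of the same node (positive pairs) and minimize
    consistency between negative pairs, here taken to be cross-node final
    teacher-student pairs and cross-node hidden teacher pairs.

    z_teacher, z_student: (N, d) projected final representations
    h_teacher:            (N, d) projected hidden teacher representations
    """
    zt = F.normalize(z_teacher, dim=1)
    zs = F.normalize(z_student, dim=1)
    ht = F.normalize(h_teacher, dim=1)

    # Positive pairs: same node, teacher vs. student.
    pos = torch.exp((zt * zs).sum(dim=1) / tau)            # (N,)

    # Negative pairs: cross-node teacher-student similarities ...
    neg_ts = torch.exp(zt @ zs.T / tau)                    # (N, N)
    # ... plus cross-node similarities among hidden teacher representations.
    neg_hh = torch.exp(ht @ ht.T / tau)                    # (N, N)

    # Exclude self-similarities (the diagonal) from the negative terms.
    eye = torch.eye(zt.size(0), dtype=torch.bool, device=zt.device)
    denom = (pos
             + neg_ts.masked_fill(eye, 0).sum(dim=1)
             + neg_hh.masked_fill(eye, 0).sum(dim=1))

    # InfoNCE-style objective: pull positives together, push negatives apart.
    return -torch.log(pos / denom).mean()
```

Including the hidden-layer teacher pairs among the negatives is one plausible way to realize the abstract's claim that teacher-student consistency maximization alone is insufficient on heterophilic graphs: the extra repulsion term discourages all representations from collapsing to a single point.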