Keywords: GNNs, OOD Generalization, Domain Generalization, Semi-supervised Learning, LLMs, MoE
Abstract: Although graph neural networks (GNNs) have shown remarkable performance in graph machine learning, their effectiveness in practice often suffers from realistic challenges, including distribution shifts and label scarcity. Towards this end, this paper studies the problem of semi-supervised domain generalization, which aims to improve the performance of GNNs on unseen graphs using both labeled and unlabeled data. We propose a novel approach named $\underline{L}$LM-Gu$\underline{i}$ded $\underline{G}$rap$\underline{h}$ Expert Rou$\underline{t}$ing (LIGHT) for semi-supervised domain generalization. The core idea of LIGHT is to distill knowledge from an LLM-as-a-judge to determine context-aware routing weights for a multi-hop graph mixture-of-experts framework. In particular, LIGHT employs diverse graph experts that explore neighborhood information at varying depths. More importantly, we leverage LLMs to judge which graph experts are most reliable for crucial nodes, providing context-aware routing guidance with high generalizability for knowledge distillation. To further address label scarcity, we introduce an expert-aware dynamic pseudo-labeling strategy that selects reliable nodes for additional training. Extensive experiments on various benchmark datasets validate the effectiveness of the proposed LIGHT in comparison with competing approaches. Our source code can be found at $\url{https://anonymous.4open.science/r/LIGHT-A817}$.
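To make the abstract's core mechanism concrete, below is a minimal sketch of a multi-hop graph mixture-of-experts with per-node routing and a distillation term toward LLM-judge targets. This is not the authors' implementation; the class and function names (`HopExpert`, `MultiHopMoE`, `routing_distillation_loss`), the dense normalized adjacency, and the KL-based distillation objective are assumptions for illustration only.

```python
# Illustrative sketch (assumptions, not the LIGHT code): each expert aggregates
# a different neighborhood depth, a router produces per-node mixing weights,
# and the weights can be distilled toward LLM-judge targets on crucial nodes.
import torch
import torch.nn as nn
import torch.nn.functional as F


class HopExpert(nn.Module):
    """Graph expert that aggregates features over a fixed number of hops."""
    def __init__(self, in_dim, out_dim, hops):
        super().__init__()
        self.hops = hops
        self.proj = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj_norm):
        # adj_norm: row-normalized adjacency matrix (dense for simplicity)
        h = x
        for _ in range(self.hops):
            h = adj_norm @ h
        return self.proj(h)


class MultiHopMoE(nn.Module):
    """Mixture of multi-hop experts with a per-node softmax router."""
    def __init__(self, in_dim, hid_dim, num_classes, max_hops=3):
        super().__init__()
        self.experts = nn.ModuleList(
            [HopExpert(in_dim, hid_dim, k) for k in range(max_hops + 1)]
        )
        self.router = nn.Linear(in_dim, max_hops + 1)
        self.classifier = nn.Linear(hid_dim, num_classes)

    def forward(self, x, adj_norm):
        weights = F.softmax(self.router(x), dim=-1)            # [N, num_experts]
        expert_out = torch.stack(
            [expert(x, adj_norm) for expert in self.experts],  # [N, num_experts, hid]
            dim=1,
        )
        mixed = (weights.unsqueeze(-1) * expert_out).sum(dim=1)  # [N, hid]
        return self.classifier(mixed), weights


def routing_distillation_loss(weights, llm_targets, crucial_mask):
    # KL divergence between the router's weights and hypothetical LLM-judge
    # target distributions, computed only on the selected "crucial" nodes.
    return F.kl_div(
        weights[crucial_mask].log(),
        llm_targets[crucial_mask],
        reduction="batchmean",
    )
```

In this reading of the abstract, the LLM-as-a-judge supplies soft expert-preference targets for a subset of crucial nodes, and the distillation loss nudges the router toward those judgments while the classifier is trained on labeled plus pseudo-labeled nodes.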
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Submission Number: 18186