You Never Walk Alone: A Generalizable and Nonparametric Structure Learning Framework

Jiaqiang Zhang, Xinrui Wang, Songcan Chen

Published: 01 Jan 2025, Last Modified: 05 Nov 2025. IEEE Transactions on Neural Networks and Learning Systems. License: CC BY-SA 4.0.
Abstract: Graph structure learning (GSL) aims to help graph neural networks (GNNs) yield effective node embeddings for downstream tasks, especially in scenarios where structures are absent or edges are unreliable. Most GSL models are built on the i.i.d. assumption across training and testing data. However, this assumption can be violated when testing data contain out-of-distribution (OOD) samples. Consequently, such models are limited in generalization, which leads to poor structures. On the other hand, despite their great progress, these models are implemented with parametric components and thus require additional parameters to be optimized. To tackle the above problems, we propose a novel generalizable and nonparametric structure learning framework named GNS, which can be easily and effectively applied to various tasks. GNS neither relies on the i.i.d. assumption nor involves any parameters to be optimized; instead, it seeks an appropriate similarity between nodes and an associated threshold to establish desirable structures. Specifically, we first incorporate the candidate neighbor distributions of nodes to refine the similarity. Then, we introduce an adaptive threshold discovery method inspired by Fisher's criterion to determine the final structures. Extensive experiments demonstrate that GNS excels not only in OOD scenarios but also in general classification and regression tasks.
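For intuition, the sketch below illustrates how the two steps named in the abstract might compose: a node similarity refined by candidate-neighbor distributions, followed by an adaptive threshold chosen with a Fisher-style between/within separation criterion. The function name, the softmax-based neighbor distribution, and the quantile-scan threshold search are all illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def gns_structure_sketch(X):
    """Minimal sketch (illustrative assumptions, not the authors' code):
    (1) refine node similarity with candidate-neighbor distributions,
    (2) pick an adaptive threshold via a Fisher-style criterion."""
    # Base cosine similarity between node feature vectors.
    Xn = X / (np.linalg.norm(X, axis=1, keepdims=True) + 1e-12)
    S = Xn @ Xn.T

    # Step 1 (assumed form): each node's candidate-neighbor distribution is a
    # softmax over its similarities; nodes whose neighbor distributions agree
    # are pulled closer, refining the raw similarity.
    P = np.exp(S - S.max(axis=1, keepdims=True))
    P /= P.sum(axis=1, keepdims=True)
    Pn = P / (np.linalg.norm(P, axis=1, keepdims=True) + 1e-12)
    refined = 0.5 * (S + Pn @ Pn.T)

    # Step 2 (assumed form): scan candidate thresholds and keep the one that
    # maximizes a Fisher-like ratio of between-group to within-group variance
    # over the resulting "edge" vs "non-edge" similarity values.
    vals = refined[np.triu_indices_from(refined, k=1)]
    best_t, best_score = vals.mean(), -np.inf
    for t in np.quantile(vals, np.linspace(0.05, 0.95, 19)):
        hi, lo = vals[vals >= t], vals[vals < t]
        if hi.size < 2 or lo.size < 2:
            continue
        score = (hi.mean() - lo.mean()) ** 2 / (hi.var() + lo.var() + 1e-12)
        if score > best_score:
            best_score, best_t = score, t

    # Keep edges whose refined similarity clears the adaptive threshold.
    A = (refined >= best_t).astype(float)
    np.fill_diagonal(A, 0.0)
    return A
```

Note that nothing here is trained: the structure is recovered directly from the feature-level similarities and a data-driven threshold, which is the nonparametric flavor the abstract describes.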