Graph Neural Network (GNN) based node embedding methods are a promising approach to learning node representations for downstream tasks such as link prediction, node classification, and node clustering. GNN-based methods usually work in an unsupervised or semi-supervised manner, learning node representations with little or no label information. We show empirically, however, that the performance of learned node embeddings on downstream tasks can be heavily impacted by the GNN method's hyperparameter configuration. Unfortunately, existing hyperparameter optimisation methods typically rely on labelled data for evaluation, making them unsuitable for unsupervised scenarios. This raises the question: how can we tune the hyperparameters of GNNs to obtain high-quality node embeddings without using label information? To answer this, we propose a framework for evaluating node embedding quality without relying on labels. Specifically, our framework consists of two steps: building prior beliefs that characterise high-quality node embeddings, and quantifying the extent to which those prior beliefs are satisfied. Importantly, we instantiate our framework from two different but complementary perspectives: spatial and spectral. First, we introduce the Consensus-based Space Occupancy Rate (CSOR) method, which evaluates node embedding quality from a spatial view by pairwise comparison of the spatial distances between node embeddings obtained under different hyperparameter configurations. Second, we present the Spectral Space Occupancy Rate (SSOR) method, which takes a spectral perspective and evaluates embedding quality by examining the singular values of the node embedding matrices. Extensive experiments on seven GNN models and four benchmark datasets demonstrate the effectiveness of both CSOR and SSOR: both methods consistently prioritise hyperparameter configurations that yield high-quality node embeddings for downstream tasks.
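To make the two perspectives concrete, below is a minimal, self-contained sketch of how such label-free quality scores could be computed. The abstract does not give the exact CSOR and SSOR formulas, so the function names, the entropy-based spectral score, and the consensus distance comparison here are illustrative assumptions, not the paper's actual definitions.

```python
import numpy as np

def spectral_occupancy_score(Z):
    """Label-free quality proxy in the spirit of SSOR (hypothetical).

    Intuition: embeddings whose singular values are spread more evenly
    "occupy" more of the embedding space and are less collapsed. Here we
    score this with the normalised entropy of the singular-value spectrum.
    """
    s = np.linalg.svd(Z, compute_uv=False)
    p = s / s.sum()
    entropy = -(p * np.log(p + 1e-12)).sum()
    return entropy / np.log(len(p))  # normalise to [0, 1]

def consensus_spatial_scores(embeddings):
    """Label-free quality proxy in the spirit of CSOR (hypothetical).

    Intuition: compare each configuration's pairwise node-distance matrix
    against the average ("consensus") distance matrix over all candidate
    configurations; configurations closer to the consensus score higher.
    """
    def pdist(Z):
        sq = (Z ** 2).sum(axis=1)
        d = np.sqrt(np.maximum(sq[:, None] + sq[None, :] - 2 * Z @ Z.T, 0.0))
        return d / d.max()  # rescale so configurations are comparable
    dists = [pdist(Z) for Z in embeddings]
    consensus = np.mean(dists, axis=0)
    # Negative mean deviation from consensus: higher is better.
    return [-np.abs(d - consensus).mean() for d in dists]

# Toy usage: rank three random "configurations" by each proxy.
rng = np.random.default_rng(0)
embs = [rng.normal(size=(100, 16)) for _ in range(3)]
print([round(spectral_occupancy_score(Z), 3) for Z in embs])
print(consensus_spatial_scores(embs))
```

In a hyperparameter search, scores like these would be computed per candidate configuration, and the configuration with the highest score selected, with no downstream labels required.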