Abstract: This paper studies the problem of semi-supervised learning on graphs, which has recently attracted widespread interest in relational data mining. The focal point of exploration in this area has been the utilization of graph neural networks (GNNs), which stand out for their excellent performance. Previous methods, however, typically rely on the limited labeled data while ignoring the abundant structural information inherent in the unlabeled nodes of graphs, easily resulting in overfitting, especially in scenarios where only a few labeled nodes are available. Even worse, GNNs, despite their success, are limited to capturing local neighborhood information through message-passing mechanisms, thereby falling short in modeling higher-order dependencies among nodes. To circumvent the above drawbacks, we propose a simple yet effective framework called Hypergraph COnsistency LeArning (HOLA). Specifically, we employ a collaborative distillation framework consisting of a teacher network and a student network. To achieve effective interaction, we propose momentum distillation, a self-training method that enables the student network to learn from pseudo-targets generated by a momentum teacher network. Further, a novel hypergraph structure learning network is developed to model complex high-order relations among nodes with relational consistency learning, thereby transferring this knowledge to the student network. Extensive experiments conducted on a variety of benchmark datasets demonstrate the superior performance of HOLA over various state-of-the-art methods.
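The momentum teacher described above is typically maintained as an exponential moving average (EMA) of the student's parameters, so that it evolves slowly and produces stable pseudo-targets. A minimal sketch of that update rule follows; the function name, parameter representation, and momentum value are illustrative assumptions, not the paper's actual implementation.

```python
# Hedged sketch of a momentum (EMA) teacher update, as commonly used in
# momentum-distillation setups. Parameters are modeled as flat lists of
# floats for simplicity; real implementations update tensors in place.

def update_momentum_teacher(teacher_params, student_params, momentum=0.99):
    """Move each teacher parameter toward the student parameter via EMA."""
    return [momentum * t + (1.0 - momentum) * s
            for t, s in zip(teacher_params, student_params)]

# Usage: the teacher tracks a slow moving average of the student, so its
# pseudo-targets change smoothly even if the student fluctuates.
teacher = [0.0]
student = [1.0]
for _ in range(3):
    teacher = update_momentum_teacher(teacher, student)
# After k steps toward a fixed student value s, the teacher equals
# s * (1 - momentum**k).
```

Because the teacher is never updated by gradient descent directly, this scheme adds little computational overhead while providing a smoothed target for the student's consistency loss.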
External IDs: dblp:journals/tmm/YiMWGXCHZJ25