Beyond Message Passing Paradigm: Training Graph Data with Consistency Constraints

29 Sept 2021 (modified: 13 Feb 2023) · ICLR 2022 Conference Withdrawn Submission · Readers: Everyone
Keywords: Graph Learning, Multilayer Perceptrons, Consistency Constraints
Abstract: Recent years have witnessed great success in handling graph-related tasks with Graph Neural Networks (GNNs). However, most existing GNNs rely on powerful message passing to guide feature aggregation among neighbors. Despite their success, three weaknesses still limit their capacity for training on graph data: weak generalization when labeled data are severely limited, poor robustness to label noise and structure perturbation, and the high computation and memory cost of keeping the entire graph. In this paper, we propose a simple yet effective Graph Consistency Learning (GCL) framework, which is based purely on multilayer perceptrons, where structure information is only implicitly incorporated as prior knowledge in the computation of supervision signals but is not explicitly involved in the forward pass. Specifically, the GCL framework is optimized with three well-designed consistency constraints: neighborhood consistency, label consistency, and class-center consistency. More importantly, we provide a theoretical analysis of the connections between message passing and consistency constraints. Extensive experiments show that GCL achieves encouraging performance with better generalization and robustness than other leading methods.
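The core idea can be pictured with a minimal PyTorch sketch. This is not the authors' code: the names `MLP`, `gcl_loss`, and `adj_norm`, and the exact forms of the three loss terms, are illustrative assumptions about how a structure-free forward pass could be paired with structure-aware supervision signals.

```python
# A hypothetical sketch: an MLP classifier whose forward pass never touches
# the graph; the (row-normalized, dense) adjacency matrix adj_norm is used
# only to build the supervision signals for three consistency losses.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MLP(nn.Module):
    def __init__(self, in_dim, hid_dim, n_classes):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hid_dim), nn.ReLU(),
            nn.Linear(hid_dim, n_classes))

    def forward(self, x):
        # Structure-free forward pass: node features only, no adjacency.
        return self.net(x)

def gcl_loss(logits, adj_norm, labels, train_mask):
    """Illustrative versions of the three consistency constraints.
    adj_norm appears only here, never in the model's forward pass."""
    p = F.softmax(logits, dim=-1)
    n_classes = logits.size(1)
    # (1) Neighborhood consistency: each node's prediction should agree
    #     with the average prediction of its graph neighbors.
    loss_nc = F.mse_loss(p, (adj_norm @ p).detach())
    # (2) Label consistency: cross-entropy on the labeled nodes.
    loss_lc = F.cross_entropy(logits[train_mask], labels[train_mask])
    # (3) Class-center consistency: labeled nodes' predictions should stay
    #     close to the mean prediction ("center") of their class
    #     (assumes every class appears in the training set).
    centers = torch.stack([p[train_mask][labels[train_mask] == c].mean(0)
                           for c in range(n_classes)])
    loss_cc = F.mse_loss(p[train_mask], centers[labels[train_mask]].detach())
    return loss_lc + loss_nc + loss_cc
```

Because the adjacency matrix enters only through the loss, inference requires no graph at all, which is one way such a design could avoid the memory burden of keeping the entire graph at test time.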
One-sentence Summary: We propose a novel framework, based purely on MLPs, in which structure information is only implicitly incorporated as prior knowledge in the computation of supervision signals but is not explicitly involved in the forward pass.