Convolutional Networks on Enhanced Message-Passing Graph Improve Semi-Supervised Classification with Few Labels

29 Sept 2021 (modified: 13 Feb 2023) · ICLR 2022 Conference Withdrawn Submission
Keywords: Shallow Graph Networks, Multi-Channel Message Aggregation, Data Augmentation, Over-Smoothing, Overfitting
Abstract: Efficient message propagation is critical to node classification in sparse graphs with few labels, a problem that has remained largely unaddressed until now. The recently popularized Graph Convolutional Networks (GCNs) cannot propagate messages to distant nodes because of over-smoothing, and their numerous parameters make them prone to overfitting when labeled nodes are scarce. We attack this problem by building GCNs on an Enhanced Message-Passing Graph (EMPG). The key idea is that node classification can benefit from variants of the original graph that propagate messages more efficiently, based on the assumption that each variant is a plausible structure once more nodes are properly labeled. Specifically, we first map nodes to a latent space through a graph embedding that captures structural information. Taking node attributes into account as well, we construct the EMPG by adding connections between nodes that lie close in the latent space. With these added connections, the EMPG allows a node to propagate messages to the right distant nodes, so GCNs on the EMPG need not stack multiple layers and therefore avoid over-smoothing. However, adding connections may saturate message propagation or lead to overfitting. Viewing the EMPG as an accumulation of potential variants of the original graph, we apply dropout to the EMPG and train GCNs on the resulting dropout graphs. The features learned from the different dropout EMPGs are aggregated to compute the final prediction. Experiments demonstrate a significant improvement in node classification on sparse graphs with few labels.
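The two complementary steps the abstract describes, adding latent-space edges to form the EMPG and then training on randomly edge-dropped copies of it, can be sketched as follows. This is a minimal illustration with assumed details: a k-nearest-neighbor rule in embedding space stands in for the paper's proximity criterion (which also weighs node attributes), and averaging the dropout views stands in for aggregating GCN outputs.

```python
import numpy as np

def build_empg(adj, emb, k=1):
    """Enhance the graph by connecting each node to its k nearest
    neighbors in the embedding space (hypothetical proximity rule;
    the paper also considers node attributes)."""
    empg = adj.copy()
    # pairwise Euclidean distances between node embeddings
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    np.fill_diagonal(dist, np.inf)  # no self-loops
    for i in range(adj.shape[0]):
        for j in np.argsort(dist[i])[:k]:
            empg[i, j] = empg[j, i] = 1  # add symmetric shortcut edge
    return empg

def dropout_graph(adj, p, rng):
    """Randomly drop each undirected edge with probability p."""
    keep = np.triu(rng.random(adj.shape) >= p, k=1)
    kept = adj * keep
    return kept + kept.T

rng = np.random.default_rng(0)

# toy example: a 4-node path graph; nodes 0 and 3 are distant in the
# graph but close in the (made-up) 2-d embedding space
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]])
emb = np.array([[0.0, 0.0], [0.1, 0.0], [1.0, 1.0], [0.05, 0.05]])

empg = build_empg(adj, emb, k=1)          # adds shortcut edge (0, 3)
views = [dropout_graph(empg, p=0.3, rng=rng) for _ in range(4)]
# in the paper, GCN predictions on each dropout EMPG are aggregated;
# averaging the adjacency views here is just a stand-in for that step
avg_view = np.mean(views, axis=0)
```

The shortcut edge lets a one- or two-layer GCN reach node 3 from node 0 directly, while the random dropout views keep any single added edge from dominating training.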
One-sentence Summary: Two seemingly contradictory strategies, adding links elaborately and dropping edges randomly, actually complement one another and make a difference in node classification.

