Edge-Splitting MLP: Node Classification on Homophilic and Heterophilic Graphs without Message Passing
Keywords: Node classification, Heterophily, GNN Architectures
TL;DR: Generalizing Graph-MLP to heterophilic graphs.
Abstract: Message Passing Neural Networks (MPNNs) have demonstrated remarkable success in node classification on homophilic graphs.
It has been shown that they do not rely solely on homophily but rather on the neighborhood distributions of nodes, i.e., the consistency of neighborhood label distributions within a class.
MLP-based models do not use message passing; e.g., Graph-MLP instead incorporates neighborhood information through a separate loss function.
These models are faster and more robust to edge noise.
Graph-MLP maps adjacent nodes closer together in the embedding space but is unaware of the neighborhood label patterns, i.e., it relies solely on homophily.
Edge-Splitting GNN (ES-GNN) is a model specialized for heterophilic graphs that splits edges into task-relevant and task-irrelevant ones.
To mitigate the limitations of Graph-MLP on heterophilic graphs, we propose ES-MLP, which combines Graph-MLP with the edge-splitting mechanism of ES-GNN.
It incorporates edge splitting into the loss of Graph-MLP to learn two separate adjacency matrices based on relevant and irrelevant feature pairs (sketched below).
Our experiments on seven datasets with five baselines show that ES-MLP is on par with both homophilic and heterophilic models on all datasets while not using edges during inference.
We show that ES-MLP is robust to multiple types of edge noise during inference and that its inference is two to five times faster than that of commonly used MPNNs.
We will make our source code available.
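Below is a minimal, hypothetical sketch of how the objective described above could be wired up in PyTorch: an MLP encoder with two projection heads (a task-relevant and a task-irrelevant channel), a soft edge split that turns the observed edges into weights for two adjacency matrices, and a Graph-MLP-style neighborhood-contrastive loss applied per channel. The module names, the cosine-similarity edge scoring, and the temperature `tau` are illustrative assumptions, not details taken from the paper; edges are only needed at training time.

```python
# Hypothetical sketch of an ES-MLP-style objective (not the authors' code).
# Module names, the cosine-similarity edge scoring, and tau are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ESMLPSketch(nn.Module):
    def __init__(self, in_dim, hid_dim, num_classes, tau=1.0):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, hid_dim), nn.GELU(), nn.Linear(hid_dim, hid_dim)
        )
        # Two heads: task-relevant and task-irrelevant embedding channels.
        self.rel_proj = nn.Linear(hid_dim, hid_dim)
        self.irr_proj = nn.Linear(hid_dim, hid_dim)
        self.classifier = nn.Linear(hid_dim, num_classes)
        self.tau = tau

    def forward(self, x):
        # Inference uses node features only; no edges are required here.
        h = self.encoder(x)
        z_rel, z_irr = self.rel_proj(h), self.irr_proj(h)
        return self.classifier(z_rel), z_rel, z_irr

    def edge_split(self, z_rel, z_irr, edge_index):
        # Softly assign each observed edge to the relevant or irrelevant
        # channel, yielding weights for two adjacency matrices (assumption).
        src, dst = edge_index
        s_rel = F.cosine_similarity(z_rel[src], z_rel[dst], dim=-1)
        s_irr = F.cosine_similarity(z_irr[src], z_irr[dst], dim=-1)
        w = torch.softmax(torch.stack([s_rel, s_irr], dim=-1), dim=-1)
        return w[:, 0], w[:, 1]

    def ncontrast_loss(self, z, edge_index, edge_weight, num_nodes):
        # Graph-MLP-style neighborhood contrast: pull weighted neighbors
        # together in embedding space relative to all other nodes.
        sim = torch.exp(
            F.cosine_similarity(z.unsqueeze(1), z.unsqueeze(0), dim=-1) / self.tau
        )
        sim = sim - torch.diag_embed(torch.diagonal(sim))  # drop self-pairs
        adj = torch.zeros(num_nodes, num_nodes, device=z.device)
        adj[edge_index[0], edge_index[1]] = edge_weight
        pos = (sim * adj).sum(dim=1)
        denom = sim.sum(dim=1)
        mask = pos > 0  # only nodes with at least one weighted neighbor
        return -torch.log((pos[mask] + 1e-12) / (denom[mask] + 1e-12)).mean()
```

In such a setup, training would typically combine a cross-entropy loss on labeled nodes with the contrastive term on the relevant channel (and optionally a corresponding term on the irrelevant channel), while inference only calls `forward` on node features, which is why no edges are needed at that stage.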
Submission Type: Full paper proceedings track submission (max 9 main pages).
Software: https://github.com/MatthiasKohn/ES-MLP
Submission Number: 133