KAGNNs: Kolmogorov-Arnold Networks meet Graph Learning

Published: 05 Mar 2025, Last Modified: 05 Mar 2025, Accepted by TMLR, License: CC BY 4.0
Abstract: In recent years, Graph Neural Networks (GNNs) have become the de facto tool for learning node and graph representations. Most GNNs consist of a sequence of neighborhood aggregation (a.k.a. message-passing) layers, within which the representation of each node is updated based on those of its neighbors. The most expressive message-passing GNNs can be obtained through the use of the sum aggregator and of MLPs for feature transformation, thanks to their universal approximation capabilities. However, the limitations of MLPs recently motivated the introduction of another family of universal approximators, called Kolmogorov-Arnold Networks (KANs), which rely on a different representation theorem. In this work, we compare the performance of KANs against that of MLPs on graph learning tasks. We implement three new KAN-based GNN layers, inspired respectively by the GCN, GAT and GIN layers. We evaluate two different implementations of KANs using two distinct base families of functions, namely B-splines and radial basis functions. We perform extensive experiments on node classification, link prediction, graph classification and graph regression datasets. Our results indicate that KANs are on par with or better than MLPs on all tasks studied in this paper. We also show that the model size and training time of RBF-based KANs are only marginally larger than those of MLPs, making them viable alternatives. Code available at https://github.com/RomanBresson/KAGNN.
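To make the idea concrete, below is a minimal sketch (not the authors' implementation from the linked repository) of the kind of layer the abstract describes: a GIN-style update with sum aggregation where the usual MLP is replaced by a small RBF-based KAN. The names RBFKANLayer and KANGINLayer, the Gaussian-RBF parameterization, and all hyperparameters are illustrative assumptions.

```python
# Sketch only: a KAN-style GIN layer with Gaussian-RBF univariate functions.
# Hypothetical module names; not the paper's released code.
import torch
import torch.nn as nn


class RBFKANLayer(nn.Module):
    """One KAN layer: a learnable univariate function (mixture of Gaussian
    RBFs) on every input-output pair, summed over the input dimension."""

    def __init__(self, in_dim, out_dim, num_centers=8, low=-2.0, high=2.0):
        super().__init__()
        self.register_buffer("centers", torch.linspace(low, high, num_centers))
        self.gamma = (num_centers / (high - low)) ** 2  # RBF width (assumed)
        # One coefficient per (input feature, output feature, basis function).
        self.coeffs = nn.Parameter(
            torch.randn(in_dim, out_dim, num_centers) / (in_dim * num_centers) ** 0.5
        )

    def forward(self, x):                                     # x: (N, in_dim)
        diff = x.unsqueeze(-1) - self.centers                 # (N, in_dim, C)
        phi = torch.exp(-self.gamma * diff ** 2)              # evaluate bases
        # Sum the learned univariate functions over the input dimension.
        return torch.einsum("nic,ioc->no", phi, self.coeffs)  # (N, out_dim)


class KANGINLayer(nn.Module):
    """GIN-style update where the feature-transformation MLP is replaced
    by a two-layer RBF-based KAN."""

    def __init__(self, in_dim, hidden_dim, out_dim):
        super().__init__()
        self.eps = nn.Parameter(torch.zeros(1))
        self.kan = nn.Sequential(
            RBFKANLayer(in_dim, hidden_dim),
            RBFKANLayer(hidden_dim, out_dim),
        )

    def forward(self, x, edge_index):                         # edge_index: (2, E)
        src, dst = edge_index
        # Sum aggregation of neighbor features.
        agg = torch.zeros_like(x).index_add_(0, dst, x[src])
        return self.kan((1 + self.eps) * x + agg)


# Toy usage: 4 nodes with 3 features and a small directed edge list.
x = torch.randn(4, 3)
edge_index = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 0]])
out = KANGINLayer(3, 16, 8)(x, edge_index)
print(out.shape)  # torch.Size([4, 8])
```

A B-spline-based variant would follow the same structure, swapping the Gaussian bases for spline bases; see the repository above for the actual layers evaluated in the paper.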
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: Addressed the action editor's concerns: (i) added an extensive and updated related work section to motivate the work; (ii) added a KAN-based version of the GAT architecture; (iii) updated the results with the new architecture, including link prediction.
Code: https://github.com/RomanBresson/KAGNN
Assigned Action Editor: ~Chuxu_Zhang2
Submission Number: 3806