KA-GAT: Kolmogorov–Arnold based Graph Attention Networks

ICLR 2025 Conference Submission 1509 Authors

18 Sept 2024 (modified: 22 Nov 2024) · ICLR 2025 Conference Submission · Everyone · Revisions · BibTeX · CC BY 4.0
Keywords: Graph Neural Networks, Kolmogorov-Arnold Networks, Graph Attention Networks, Multi-head Attention Mechanism, Model Interpretability
TL;DR: KA-GAT is a graph neural network combining KAN and GAT, optimised for high-dimensional data processing and model interpretability.
Abstract: Graph Neural Networks (GNNs) have demonstrated remarkable capabilities in processing graph-structured data, but they often struggle with high-dimensional features and complex, nonlinear relationships. To address these challenges, we propose KA-GAT, a novel model that integrates Kolmogorov-Arnold Networks (KANs) with Graph Attention Networks (GATs). KA-GAT leverages KAN to decompose and reconstruct high-dimensional features, enhancing representational capacity, while a multi-head attention mechanism dynamically focuses on key graph components, improving interpretability. Experimental results on benchmark datasets, including Cora and Citeseer, demonstrate that KA-GAT achieves significant accuracy improvements over baseline models such as GAT, with a relative gain of 4.5% on Cora. These findings highlight KA-GAT's robustness and its potential as an interpretable and scalable solution for high-dimensional graph data, paving the way for further advances in GNN research.
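The abstract describes a two-stage pipeline: a KAN layer transforms node features through learned univariate functions, and multi-head graph attention then aggregates over neighbours. The sketch below is a minimal NumPy illustration of that idea, not the authors' implementation: it assumes a radial-basis-function parameterisation of the KAN edge functions and a standard GAT-style attention score, and all names (`kan_layer`, `gat_head`, the sizes) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def kan_layer(x, coeffs, centers, width=1.0):
    """KAN-style layer (illustrative): each input dimension passes through a
    learnable univariate function, here a sum of Gaussian radial basis
    functions with per-edge coefficients, then contributions are summed.
    x: (n, d_in), coeffs: (d_in, d_out, k), centers: (k,) -> (n, d_out)."""
    phi = np.exp(-((x[..., None] - centers) ** 2) / (2 * width**2))  # (n, d_in, k)
    return np.einsum("nik,iok->no", phi, coeffs)

def gat_head(h, adj, a_src, a_dst, slope=0.2):
    """Single GAT-style attention head: score every edge, softmax over each
    node's neighbourhood, then aggregate neighbour features.
    h: (n, d), adj: (n, n) binary with self-loops, a_src/a_dst: (d,)."""
    e = (h @ a_src)[:, None] + (h @ a_dst)[None, :]   # additive edge scores
    e = np.where(e > 0, e, slope * e)                 # LeakyReLU
    e = np.where(adj > 0, e, -np.inf)                 # mask non-edges
    e = e - e.max(axis=1, keepdims=True)              # stable softmax
    alpha = np.exp(e)
    alpha = alpha / alpha.sum(axis=1, keepdims=True)  # attention weights
    return alpha @ h

# Toy graph: 6 nodes, 4-dim features, random sparse adjacency.
n, d_in, d_hid, k = 6, 4, 8, 5
x = rng.normal(size=(n, d_in))
adj = (rng.random((n, n)) < 0.4).astype(float)
np.fill_diagonal(adj, 1.0)  # self-loops keep every softmax row well-defined

centers = np.linspace(-2.0, 2.0, k)
coeffs = 0.1 * rng.normal(size=(d_in, d_hid, k))
h = kan_layer(x, coeffs, centers)                     # KAN feature transform

# Two attention heads, concatenated as in multi-head GAT.
heads = [gat_head(h, adj, rng.normal(size=d_hid), rng.normal(size=d_hid))
         for _ in range(2)]
out = np.concatenate(heads, axis=1)
print(out.shape)  # (6, 16)
```

A trainable version would learn `coeffs` and the attention vectors by backpropagation (e.g. in PyTorch); this sketch only shows the forward dataflow the abstract describes.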
Supplementary Material: zip
Primary Area: learning on graphs and other geometries & topologies
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 1509