Exploring convolutional KAN architectures with NAS

Published: 09 Mar 2025, Last Modified: 11 Mar 2025 · MathAI 2025 Oral · CC BY 4.0
Keywords: neural architecture search, kolmogorov-arnold networks, convolution layers
TL;DR: Convolutional KANs can perform better than reported in prior work.
Abstract: This study addresses the challenge of evaluating emerging neural architectures against extensively optimized legacy models. Kolmogorov-Arnold networks (KANs) offer a potential alternative to conventional deep learning, yet their benefits remain difficult to quantify. We introduce a neural architecture search (NAS) framework that systematically optimizes and compares KANs with convolutional neural networks (CNNs), eliminating human design biases. Experiments on image classification (MNIST, Fashion-MNIST, EuroSAT) and sea ice concentration estimation reveal distinct performance characteristics, demonstrating the impact of automated optimization on architectural selection.
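To make the KAN idea referenced in the abstract concrete, the sketch below shows a minimal KAN-style layer: instead of a fixed activation on each neuron, every input-output edge carries its own learnable univariate function. This is an illustrative assumption-laden sketch, not the paper's implementation; published KANs typically parameterize edge functions with B-splines, while this version uses Gaussian radial basis functions to stay dependency-free.

```python
import numpy as np

def rbf_basis(x, centers, width):
    # Evaluate k Gaussian radial basis functions at each point in x.
    # x: [batch] -> returns [batch, k]
    return np.exp(-(((x[:, None] - centers[None, :]) / width) ** 2))

class KANLayer:
    """Minimal sketch of a KAN layer. Each (input, output) edge has its
    own learnable 1-D function, represented here as a linear combination
    of RBFs (an assumption; the original KAN papers use B-splines)."""

    def __init__(self, in_dim, out_dim, n_basis=8, seed=0):
        rng = np.random.default_rng(seed)
        self.centers = np.linspace(-1.0, 1.0, n_basis)  # basis grid on [-1, 1]
        self.width = 2.0 / n_basis                      # basis bandwidth
        # One coefficient vector per edge: [in_dim, out_dim, n_basis]
        self.coef = rng.normal(0.0, 0.1, size=(in_dim, out_dim, n_basis))

    def forward(self, x):
        # x: [batch, in_dim] -> [batch, out_dim]
        batch, in_dim = x.shape
        out = np.zeros((batch, self.coef.shape[1]))
        for i in range(in_dim):
            phi = rbf_basis(x[:, i], self.centers, self.width)  # [batch, n_basis]
            out += phi @ self.coef[i].T  # sum each edge's function into its output
        return out

layer = KANLayer(in_dim=3, out_dim=2)
y = layer.forward(np.zeros((4, 3)))
```

In a NAS setting like the one described above, the searchable choices would include the number of basis functions, the grid range, and how such layers interleave with convolutions; those hyperparameters are exactly what makes a fair hand-designed comparison with tuned CNNs difficult.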
Submission Number: 33
