Keywords: neural architecture search, Kolmogorov-Arnold networks, convolution layers
TL;DR: Convolutional KANs may perform better than reported in the existing literature.
Abstract: This study addresses the challenge of evaluating emerging neural architectures against extensively optimized legacy models. Kolmogorov-Arnold networks (KANs) offer a potential alternative to conventional deep learning, yet their benefits remain difficult to quantify. We introduce a neural architecture search (NAS) framework that systematically optimizes and compares KANs with convolutional neural networks (CNNs), eliminating human design biases. Experiments on image classification (MNIST, Fashion-MNIST, EuroSAT) and sea ice concentration estimation reveal distinct performance characteristics, demonstrating the impact of automated optimization on architectural selection.
Submission Number: 33