Beyond Graph-Based Modeling for Hierarchical Neural Architecture Search

Published: 12 Jul 2024, Last Modified: 09 Aug 2024 · AutoML 2024 Workshop · CC BY 4.0
Keywords: Hierarchical Neural Architecture Search, Neural Architecture String Kernel, NASK, hNASK, BOHNAS, Bayesian Optimization, AutoML
TL;DR: We introduce hNASK, a string kernel for hierarchical neural architecture search that exploits hierarchical search spaces while preserving the performance of BOHNAS
Abstract: Neural Architecture Search (NAS) seeks to automate the discovery of well-performing neural architectures. Recently, a hierarchical approach to NAS (hNAS) has been shown to allow for high search space expressiveness and efficient searching. However, BOHNAS, the current best strategy for hNAS, requires converting sampled networks from their native string representations into graphs, which complicates extending hierarchical search spaces beyond architectures to other pipeline components, such as hyperparameters, learning rate schedules, and data augmentation. In this work, we introduce hNASK, a string kernel that operates directly on such string representations, exploits the hierarchical structure of the search space, and preserves the performance of BOHNAS on the performed architecture search experiments. As such, this kernel opens the door for future work in hNAS without being constrained to graph-based modeling of search spaces. Code is available at https://github.com/automl/hnas_with_string_kernels.
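To illustrate the general idea of a string kernel operating on architecture strings (this is a minimal spectrum-kernel sketch, not the hNASK kernel from the paper; the toy grammar strings `arch1`/`arch2` are invented for illustration), one can count shared k-grams between two serialized architectures:

```python
from collections import Counter

def spectrum_features(s, k=3):
    """Count all length-k substrings (k-grams) of an architecture string."""
    return Counter(s[i:i + k] for i in range(len(s) - k + 1))

def string_kernel(a, b, k=3):
    """Inner product of k-gram count vectors: a basic spectrum string kernel."""
    fa, fb = spectrum_features(a, k), spectrum_features(b, k)
    return sum(count * fb[gram] for gram, count in fa.items())

# Toy architecture strings in a hypothetical hierarchical grammar.
arch1 = "Seq(Conv3x3, ReLU, Conv3x3)"
arch2 = "Seq(Conv3x3, ReLU, Pool2x2)"
print(string_kernel(arch1, arch1))  # self-similarity
print(string_kernel(arch1, arch2))  # cross-similarity
```

Such a kernel is symmetric and positive semi-definite, so it can serve as a Gaussian-process covariance in Bayesian optimization; a hierarchy-aware kernel like hNASK additionally weights substrings by their position in the grammar's derivation.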
Submission Checklist: Yes
Broader Impact Statement: Yes
Paper Availability And License: Yes
Code Of Conduct: Yes
Submission Number: 20