On the Optimality Gap of Warm-Started Hyperparameter Optimization

Published: 16 May 2022, Last Modified: 05 May 2023, AutoML-Conf 2022 (Main Track)
Abstract: We study the general framework of warm-started hyperparameter optimization (HPO), where we have already performed HPO on several source datasets (tasks) and wish to leverage the results of these runs to warm-start HPO on an unseen target dataset (that is, to perform few-shot HPO). Various meta-learning schemes have been proposed for this problem over the last decade (and more). In this paper, we theoretically analyse the optimality gap of the hyperparameters obtained via such warm-started few-shot HPO, and provide novel results for multiple existing meta-learning schemes. We show how these results allow us to identify situations where certain schemes have an advantage over others.
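For concreteness, the warm-start protocol described in the abstract can be instantiated as follows: the incumbent (best-found) configuration from each source task forms a small candidate set, and only these candidates are evaluated on the target task under a few-shot budget. The sketch below is a minimal illustration with hypothetical names; it is not the authors' code or any specific meta-learning scheme analysed in the paper.

```python
# Minimal sketch of warm-started few-shot HPO (hypothetical names, not the paper's code).
# Each source task contributes its incumbent configuration; the target task evaluates
# only this small candidate set, i.e. a "few-shot" evaluation budget.

def warm_started_few_shot_hpo(source_incumbents, target_loss, budget):
    """Evaluate at most `budget` warm-start candidates on the target task.

    source_incumbents: list of hyperparameter configs, one per source task
    target_loss:       callable mapping a config to its validation loss on the target
    budget:            number of target-task evaluations allowed (few-shot)
    """
    candidates = source_incumbents[:budget]  # simple truncation; other orderings are possible
    evaluated = [(cfg, target_loss(cfg)) for cfg in candidates]
    best_cfg, best_loss = min(evaluated, key=lambda pair: pair[1])
    return best_cfg, best_loss


# Toy usage: source incumbents are learning rates, target loss is a quadratic around 0.1.
if __name__ == "__main__":
    incumbents = [{"lr": 0.3}, {"lr": 0.05}, {"lr": 0.1}]
    loss = lambda cfg: (cfg["lr"] - 0.1) ** 2
    print(warm_started_few_shot_hpo(incumbents, loss, budget=3))
```

The optimality gap studied in the paper is, informally, the difference between the target-task loss of the configuration returned by such a warm-started procedure and the loss of the best configuration in the full search space.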
Keywords: Meta learning, warm-starting, hyperparameter optimization, theoretical analysis, optimality gap
One-sentence Summary: We provide theoretical guarantees for various meta-learning-based warm-started hyperparameter optimization schemes.
Track: Main track
Reproducibility Checklist: Yes
Broader Impact Statement: Yes
Paper Availability And License: Yes
Code Of Conduct: Yes
Reviewers: Already a reviewer
CPU Hours: 0
GPU Hours: 0
TPU Hours: 0
Evaluation Metrics: No
Class Of Approaches: Meta-Learning, Sequential Model Based Optimization, Optimality Gap Analysis
Datasets And Benchmarks: NA
Performance Metrics: NA
Steps For Environmental Footprint Reduction During Development: NA
Estimated CO2e Footprint: 0
Benchmark Performance: NA
Benchmark Time: NA
Main Paper And Supplementary Material: pdf