MetaOptimize: A Framework for Optimizing Step Sizes and Other Meta-parameters

Published: 01 May 2025 · Last Modified: 18 Jun 2025 · ICML 2025 poster · CC BY 4.0
Abstract: We address the challenge of optimizing meta-parameters (hyperparameters) in machine learning, a key factor for efficient training and high model performance. Rather than relying on expensive meta-parameter search methods, we introduce MetaOptimize: a dynamic approach that adjusts meta-parameters, particularly step sizes (also known as learning rates), during training. More specifically, MetaOptimize can wrap around any first-order optimization algorithm, tuning step sizes on the fly to minimize a specific form of regret that accounts for the long-term effect of step sizes on training, through a discounted sum of future losses. We also introduce lower-complexity variants of MetaOptimize that, in conjunction with its adaptability to various optimization algorithms, achieve performance comparable to that of the best hand-crafted learning-rate schedules across diverse machine learning tasks.
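To make the wrap-around idea concrete, here is a minimal sketch, not the paper's exact algorithm (see the linked repository for that), of adapting a scalar step size on the fly around a plain SGD base update. It uses a one-step hypergradient-style meta-update, which can be read as a myopic special case of minimizing a discounted sum of future losses; the names `alpha` and `meta_lr` and the toy objective are illustrative assumptions.

```python
import numpy as np

def loss_and_grad(w):
    # Toy quadratic objective: L(w) = 0.5 * ||w||^2, so grad = w.
    return 0.5 * float(w @ w), w.copy()

rng = np.random.default_rng(0)
w = rng.normal(size=10)       # model parameters
alpha = 1e-3                  # step size: the meta-parameter being adapted
meta_lr = 1e-2                # learning rate for the step size itself
prev_grad = np.zeros_like(w)  # g_{t-1}, used to form the hypergradient

for t in range(100):
    loss, grad = loss_and_grad(w)
    # Since w_t = w_{t-1} - alpha * g_{t-1}, the loss's sensitivity to alpha
    # is approximately -g_t . g_{t-1}. Multiplicatively increase alpha when
    # successive gradients align (steady progress) and decrease it when they
    # oppose (overshooting).
    alpha *= np.exp(meta_lr * float(grad @ prev_grad))
    w -= alpha * grad          # base first-order update with the adapted step size
    prev_grad = grad
    if t % 25 == 0:
        print(f"step {t}: loss={loss:.4f}, alpha={alpha:.5f}")
```

The actual MetaOptimize framework generalizes this pattern: the base optimizer need not be SGD, and the meta-update targets the longer-horizon discounted objective rather than only the next step's loss.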
Lay Summary: Machine learning systems rely on "meta-parameters" (such as how fast a model learns) that can greatly affect performance. Traditionally, researchers manually test many combinations of these settings before training a model, a slow and expensive process. Our research introduces **MetaOptimize**, a method that automatically adjusts these meta-parameters, especially learning rates, as the model trains. Instead of sticking to a fixed schedule, MetaOptimize learns how to update these settings in real time based on how training is going. This dynamic approach reduces the need for costly trial-and-error search, speeds up training, adapts to changing environments, and works across a wide range of machine learning tasks and algorithms, all without manual tuning.
Link To Code: https://github.com/sabersalehk/MetaOptimize
Primary Area: Optimization->Non-Convex
Keywords: Continual optimization, meta-parameter optimization
Submission Number: 13965