Multi-objective Hyperparameter Optimization in the Age of Deep Learning

ICLR 2026 Conference Submission 25300 Authors

20 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Hyperparameter Optimization, Multi-objective, Deep Learning
TL;DR: We propose using multi-objective expert priors to make hyperparameter optimization feasible for expensive deep learning workloads, and show that our algorithm, PriMO, achieves state-of-the-art performance in both the multi-objective and single-objective settings.
Abstract: While Deep Learning (DL) experts often have prior knowledge about which hyperparameter settings yield strong performance, only a few Hyperparameter Optimization (HPO) algorithms can leverage such prior knowledge, and none incorporate priors over multiple objectives. As DL practitioners often need to optimize not just one but many objectives, this is a blind spot in the algorithmic landscape of HPO. To address this shortcoming, we introduce PriMO, the first HPO algorithm that can integrate multi-objective user beliefs. We show that PriMO achieves state-of-the-art performance across 8 DL benchmarks in both the multi-objective _and_ single-objective settings, clearly positioning itself as the new go-to HPO algorithm for DL practitioners.
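The page describes PriMO only at a high level, so the sketch below is a hedged illustration of the general idea of prior-guided HPO over multiple objectives, not PriMO's actual method: a toy acquisition function is multiplied by per-objective user priors whose influence decays over iterations (in the style of πBO). The Gaussian prior shapes, the `beta` decay schedule, and all names (`log_prior_accuracy`, `log_prior_latency`, `prior_weighted_score`) are illustrative assumptions.

```python
# Hypothetical sketch: prior-weighted acquisition with priors over two
# objectives (accuracy and latency), searching over the learning rate.
import numpy as np

rng = np.random.default_rng(0)

def log_prior_accuracy(log_lr):
    # Assumed user belief: best accuracy near lr = 1e-3 (Gaussian in log-space).
    return -0.5 * ((log_lr - np.log(1e-3)) / 0.5) ** 2

def log_prior_latency(log_lr):
    # Assumed user belief: best latency near lr = 1e-2 (illustrative only).
    return -0.5 * ((log_lr - np.log(1e-2)) / 0.5) ** 2

def acquisition(log_lr):
    # Stand-in for a model-based acquisition (e.g., a scalarized expected
    # improvement); here just a fixed bimodal toy function.
    return np.exp(-(log_lr + 7) ** 2) + 0.5 * np.exp(-(log_lr + 4) ** 2)

def prior_weighted_score(log_lr, t, beta=10.0, weights=(0.5, 0.5)):
    # Combine per-objective priors via a weighted sum of log-densities, then
    # decay the prior's influence as the iteration count t grows (piBO-style).
    log_pi = (weights[0] * log_prior_accuracy(log_lr)
              + weights[1] * log_prior_latency(log_lr))
    return acquisition(log_lr) * np.exp(log_pi * beta / (t + 1))

# Pick the next learning rate to evaluate at iteration t = 3.
candidates = rng.uniform(np.log(1e-5), np.log(1e-1), size=1000)
best = candidates[np.argmax(prior_weighted_score(candidates, t=3))]
print(f"next lr to evaluate: {np.exp(best):.2e}")
```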
Primary Area: optimization
Submission Number: 25300