e$^2$HPO: energy efficient Hyperparameter Optimization via energy-aware multiple information source Bayesian optimization

Published: 04 Apr 2025, Last Modified: 09 Jun 2025, LION19 2025, CC BY 4.0
Confirmation: I have read and agree with the workshop's policy on behalf of myself and my co-authors.
Tracks: Special Session 3: Sustainability in surrogate models, Bayesian optimization, and parameter tuning
Keywords: GreenAutoML, Hyperparameter optimization, Bayesian Optimization
Abstract: The democratization of Artificial Intelligence is significantly increasing the need for resource-efficient, and especially energy-efficient, Machine Learning models. While it is well established that Artificial Intelligence can enable and support sustainability in many application domains, its own sustainability is a critical concern and an open challenge for research and industry. The demand for ever more accurate Machine Learning models clashes with the fact that a linear gain in accuracy requires exponentially larger resources: a more complex model, more training data and experiments, and consequently more computational resources, entailing higher energy consumption. This paper proposes an energy-efficient hyperparameter optimization algorithm, namely e$^2$HPO, integrating recent advances in both cost-aware and multiple information source Bayesian optimization into a single schema. Experiments on three common Machine Learning algorithms, whose core hyperparameters were optimized on five different classification datasets, empirically demonstrate the benefits of the proposed algorithm. However, some Machine Learning algorithms turn out to be intrinsically energy efficient, and this can lead e$^2$HPO, and similar approaches, to underperform with respect to more naive approaches.
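The abstract names the two ingredients, cost-aware and multiple information source Bayesian optimization, without detailing the acquisition function. As one hedged illustration of the cost-aware ingredient only, the sketch below divides standard Expected Improvement by a predicted per-evaluation energy cost, a common cost-aware BO heuristic; it is not the paper's actual e$^2$HPO acquisition, and the surrogate outputs and toy numbers (`mu`, `sigma`, `predicted_energy_joules`) are hypothetical placeholders.

```python
# Minimal sketch of an energy-aware acquisition: Expected Improvement (EI)
# per unit of predicted energy cost. This is an assumption-level illustration,
# not the e^2HPO algorithm from the paper.

import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, best_f):
    """EI for minimization, given surrogate posterior mean/std at a candidate."""
    sigma = np.maximum(sigma, 1e-12)  # guard against zero posterior variance
    z = (best_f - mu) / sigma
    return (best_f - mu) * norm.cdf(z) + sigma * norm.pdf(z)

def energy_aware_acquisition(mu, sigma, best_f, predicted_energy_joules):
    """EI per unit of predicted energy: favors configurations that promise
    improvement at low expected energy cost (cost-aware BO heuristic)."""
    ei = expected_improvement(mu, sigma, best_f)
    return ei / np.maximum(predicted_energy_joules, 1e-12)

# Toy usage: two candidate hyperparameter configurations, with posterior
# predictions from hypothetical accuracy and energy surrogates.
mu = np.array([0.12, 0.10])       # predicted validation loss per candidate
sigma = np.array([0.03, 0.02])    # posterior standard deviation
energy = np.array([50.0, 400.0])  # predicted energy per evaluation (joules)
best_f = 0.15                     # best loss observed so far

print(energy_aware_acquisition(mu, sigma, best_f, energy))
# The cheaper candidate can win even with a slightly lower raw EI,
# which is the behavior an energy-aware HPO loop is after.
```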
Submission Number: 21