Abstract: This paper takes a crucial step in the development of energy-aware (EA) NAS methods by offering a benchmark that enhances the reproducibility and accessibility of EA-NAS research. Specifically, we introduce EA-HAS-Bench, the first large-scale energy-aware benchmark designed to enable the study of AutoML methods that pursue improved trade-offs between performance and search energy consumption. EA-HAS-Bench offers a vast joint architecture/hyperparameter search space, encompassing diverse configurations relevant to energy consumption, and introduces a novel surrogate model based on Bézier curves that predicts learning curves of versatile shapes and lengths. Meanwhile, recent studies have begun integrating large language models (LLMs) into AutoML frameworks to improve model search efficiency and configuration prediction, yet challenges remain in adapting these methods to energy-efficient search over vast configuration spaces, as they often neglect energy consumption metrics. To address this gap, we introduce Language-Enhanced Shrinkage Search (LESS), a plug-and-play method that leverages the analytical capabilities of LLMs to improve the energy efficiency of existing hyperparameter optimization techniques. Moreover, we adapt existing AutoML algorithms to construct energy-aware baselines. Our experiments demonstrate that these modified energy-aware AutoML methods and LESS achieve an improved balance between energy consumption and model performance.
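To make the Bézier-curve idea concrete, here is a minimal sketch of the underlying parameterization. It is an illustration under stated assumptions, not EA-HAS-Bench's actual surrogate: the paper's model predicts curve parameters from a training configuration, whereas this sketch merely fits a cubic Bézier's control-point heights to an observed (possibly partial) learning curve by least squares. All function names (`bernstein_matrix`, `fit_bezier`, `eval_bezier`) are hypothetical.

```python
import numpy as np
from math import comb

def bernstein_matrix(t, degree=3):
    """Bernstein basis values at each t; shape (len(t), degree + 1)."""
    t = np.asarray(t, dtype=float)
    return np.stack(
        [comb(degree, i) * (1 - t) ** (degree - i) * t ** i
         for i in range(degree + 1)],
        axis=1,
    )

def fit_bezier(epochs, accuracies, degree=3):
    """Least-squares fit of Bezier control heights to a learning curve (sketch)."""
    t = np.asarray(epochs, dtype=float) / float(np.max(epochs))  # map epochs to (0, 1]
    basis = bernstein_matrix(t, degree)
    ctrl, *_ = np.linalg.lstsq(basis, np.asarray(accuracies, dtype=float), rcond=None)
    return ctrl

def eval_bezier(ctrl, n_points=100):
    """Evaluate the fitted curve densely to recover the full curve shape."""
    t = np.linspace(0.0, 1.0, n_points)
    return bernstein_matrix(t, degree=len(ctrl) - 1) @ ctrl

# Toy usage: fit a noisy saturating accuracy curve observed for 10 epochs.
rng = np.random.default_rng(0)
epochs = np.arange(1, 11)
acc = 0.9 * (1.0 - np.exp(-0.3 * epochs)) + rng.normal(0.0, 0.005, size=len(epochs))
dense_curve = eval_bezier(fit_bezier(epochs, acc))
```

A handful of control points suffices to represent curves of varying shapes and lengths, which is what makes the parameterization attractive for a learning-curve surrogate.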
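The abstract describes LESS as a plug-and-play wrapper around existing hyperparameter optimization methods. The following is a rough, hypothetical sketch of that pattern only, not the paper's algorithm: a random-search loop periodically consults an advisor, here `llm_shrink`, a stub standing in for an actual LLM call, to prune unpromising regions of the space and thereby reduce the number of energy-costly evaluations. The shrinking heuristic and all names are assumptions for illustration.

```python
import random

def llm_shrink(space, history):
    """Stub for an LLM call that prunes unpromising hyperparameter values.
    A real implementation would serialize `history` into a prompt; here we
    mimic the effect by keeping values adjacent to the best config so far."""
    best_config = max(history, key=lambda rec: rec[1])[0]
    shrunk = {}
    for name, values in space.items():
        i = values.index(best_config[name])
        shrunk[name] = values[max(0, i - 1): i + 2]  # keep a small neighborhood
    return shrunk

def less_style_search(space, evaluate, budget=30, shrink_every=10):
    """Plain random search wrapped with periodic advisor-guided shrinkage."""
    history = []
    for step in range(budget):
        config = {k: random.choice(v) for k, v in space.items()}
        history.append((config, evaluate(config)))
        if (step + 1) % shrink_every == 0:
            space = llm_shrink(space, history)  # smaller space -> less energy spent
    return max(history, key=lambda rec: rec[1])

# Toy usage: a synthetic "accuracy" peaked at lr=0.01, width=64.
space = {"lr": [0.1, 0.03, 0.01, 0.003, 0.001], "width": [16, 32, 64, 128]}
score = lambda c: -abs(c["lr"] - 0.01) - abs(c["width"] - 64) / 100.0
best_cfg, best_score = less_style_search(space, score)
```

Because the shrinkage step only rewrites the search space between iterations, the same wrapper could, in principle, sit around any base optimizer, which is the plug-and-play property the abstract emphasizes.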
External IDs: dblp:journals/pami/ZhaoDZJGZLL25