The Power of Training: How Different Neural Network Setups Influence the Energy Demand

Published: 21 Feb 2024, Last Modified: 21 Feb 2024, SAI-AAAI2024 Oral, CC BY 4.0
Keywords: Sustainability, Machine Learning, Energy Demand, Carbon Footprint
TL;DR: How Different Neural Network Setups Influence the Energy Demand
Abstract: This work examines how variations in machine learning training regimes and learning paradigms affect the corresponding energy consumption. While increasing data availability and innovations in high-performance hardware fuel the training of ever more sophisticated models, they also push energy consumption and carbon emissions out of view. The goal of this work is therefore to raise awareness of the energy impact of common training parameters and processes, from learning rate and batch size to knowledge transfer. Multiple setups with different hyperparameter initializations are evaluated on two hardware configurations to obtain meaningful results. Experiments on pretraining and multitask training are conducted on top of the baseline results to assess their potential for more sustainable machine learning.
Submission Number: 7
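
The abstract describes measuring the energy impact of different training setups, but the page does not include the authors' measurement code. As a rough illustration of that kind of experiment, the sketch below varies learning rate and batch size and records the estimated emissions of each run with the codecarbon library; the choice of library, the toy model, and the random data are assumptions for illustration, not the authors' actual setup.

```python
# Hypothetical sketch (not the authors' code): compare estimated emissions of
# training runs across a small grid of learning rates and batch sizes.
import torch
import torch.nn as nn
from codecarbon import EmissionsTracker


def train_once(lr, batch_size, epochs=1):
    """Train a small stand-in model and return estimated emissions (kg CO2eq)."""
    model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()

    # Random stand-in data; a real experiment would use an actual dataset.
    x = torch.randn(1024, 32)
    y = torch.randint(0, 10, (1024,))

    tracker = EmissionsTracker()
    tracker.start()
    for _ in range(epochs):
        for i in range(0, len(x), batch_size):
            xb, yb = x[i:i + batch_size], y[i:i + batch_size]
            optimizer.zero_grad()
            loss_fn(model(xb), yb).backward()
            optimizer.step()
    # stop() returns the estimated emissions of the tracked interval.
    return tracker.stop()


for lr in (1e-3, 1e-2):
    for bs in (32, 128):
        print(f"lr={lr}, batch_size={bs}: {train_once(lr, bs):.6f} kg CO2eq")
```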