HyperTuner: a cross-layer multi-objective hyperparameter auto-tuning framework for data analytic services
Abstract: Hyperparameter optimization (HPO) is vital for machine learning models. Besides model accuracy, other tuning objectives such as model training time and energy consumption also deserve attention from data analytic service providers. It is therefore essential to take both model hyperparameters and system parameters into consideration and perform cross-layer multi-objective hyperparameter auto-tuning. Toward this challenging target, we propose HyperTuner in this paper, which leverages a well-designed ADUMBO algorithm to find the Pareto-optimal configuration set. Compared with vanilla Bayesian optimization-based methods, ADUMBO selects the most promising configuration from the generated Pareto candidate set during each iteration by maximizing a novel adaptive uncertainty metric. We evaluate HyperTuner on our local distributed TensorFlow cluster, and experimental results show that it consistently finds a Pareto configuration front superior in both convergence and diversity to those of the four baseline algorithms. Moreover, experiments with different training datasets, different optimization objectives, and different machine learning platforms verify that HyperTuner adapts well to various data analytic service scenarios.
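The abstract only outlines the selection step (pick, at each iteration, the candidate on the predicted Pareto front that maximizes an adaptive uncertainty metric). The snippet below is a minimal, hypothetical sketch of that idea; the `pareto_front`, `select_next_config`, and decay-schedule details are illustrative assumptions, not the ADUMBO metric itself.

```python
import numpy as np

def pareto_front(points):
    """Return indices of non-dominated points (all objectives minimized)."""
    idx = []
    for i, p in enumerate(points):
        dominated = any(
            np.all(q <= p) and np.any(q < p)
            for j, q in enumerate(points) if j != i
        )
        if not dominated:
            idx.append(i)
    return idx

def select_next_config(candidates, pred_means, pred_stds, iteration, total_iters):
    """Pick the Pareto-candidate configuration with the largest uncertainty score.
    The linear decay schedule is a placeholder, not the metric from the paper."""
    front = pareto_front(pred_means)
    beta = 1.0 - iteration / total_iters      # hypothetical exploration decay
    scores = beta * pred_stds.sum(axis=1)     # aggregate predictive std per candidate
    best = max(front, key=lambda i: scores[i])
    return candidates[best]

# Toy usage: 20 candidate configurations, 2 objectives (e.g., error, energy)
rng = np.random.default_rng(0)
cands = rng.uniform(size=(20, 3))             # 3 hyper/system parameters each
means = rng.uniform(size=(20, 2))             # surrogate mean per objective
stds = rng.uniform(0.01, 0.2, size=(20, 2))   # surrogate std per objective
print(select_next_config(cands, means, stds, iteration=3, total_iters=50))
```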