Abstract: Good hyperparameter values are crucial to the performance of machine learning models; poorly chosen values can cause under- or overfitting in regression and classification. A common approach to hyperparameter tuning is grid search, but it is crude and computationally expensive, and the literature offers several more efficient automatic methods, such as Bayesian optimization. In this work, we develop a Bayesian hyperparameter optimization technique with more robust performance by combining several acquisition functions in a multi-objective approach. We evaluated the method on both classification and regression tasks, using four data sets from the literature, and compared its performance with that of eight popular methods. The results show that the proposed method outperformed all of them.
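To make the core idea concrete, the following is a minimal sketch (not the paper's exact algorithm) of one Bayesian optimization step that scores candidate points with two standard acquisition functions, expected improvement (EI) and upper confidence bound (UCB), and combines them with a simple rank-sum scalarization; the kernel, hyperparameter values, and the rank-sum rule are all illustrative assumptions.

```python
# Hedged sketch: one BO step combining two acquisition functions (EI, UCB)
# via a simple multi-objective rank-sum rule. All settings are illustrative.
import numpy as np
from scipy.stats import norm

def rbf(a, b, ls=0.25):
    """Squared-exponential kernel on 1-D inputs."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    """GP posterior mean and std at candidate points Xs."""
    K = rbf(X, X) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    Ks = rbf(X, Xs)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = 1.0 - np.sum(v ** 2, axis=0)  # rbf(x, x) = 1 on the diagonal
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best, xi=0.01):
    z = (mu - best - xi) / sigma
    return np.maximum(sigma * (z * norm.cdf(z) + norm.pdf(z)), 0.0)

def upper_confidence_bound(mu, sigma, kappa=2.0):
    return mu + kappa * sigma

def propose_next(X, y, Xs):
    """Pick the candidate with the best combined rank under EI and UCB."""
    mu, sigma = gp_posterior(X, y, Xs)
    ei = expected_improvement(mu, sigma, y.max())
    ucb = upper_confidence_bound(mu, sigma)
    # Rank each acquisition separately (0 = best), then sum the ranks:
    # a simple scalarization of the two-objective acquisition problem.
    rank = lambda a: np.argsort(np.argsort(-a))
    combined = rank(ei) + rank(ucb)
    return Xs[np.argmin(combined)]

# Toy maximization problem on [0, 1] with three initial observations.
f = lambda x: np.sin(6 * x) * x
X = np.array([0.1, 0.5, 0.9])
y = f(X)
Xs = np.linspace(0.0, 1.0, 200)
x_next = propose_next(X, y, Xs)
```

In a full optimization loop, `x_next` would be evaluated on the objective, appended to `(X, y)`, and the step repeated; the rank-sum combination stands in for whatever multi-objective aggregation the method actually uses.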