Abstract: In this work, we focus on hyperparameter optimization, one of the essential steps in developing machine learning solutions. We propose a novel approach to hyperparameter optimization based on associative self-adapting structures, which allows the hyperparameter space to be explored efficiently. The core algorithm introduced in this paper combines the associative graph data structure (AGDS) with an evolutionary approach, inspired by processes used in drug discovery and by human behavior. The proposed algorithm was compared with two state-of-the-art algorithms: the Tree-structured Parzen Estimator and differential evolution. All experiments were carried out on the Penn Machine Learning Benchmarks and repeated ten times for each problem to provide objective comparisons. We optimized Random Forest and deep neural network models for every dataset in these benchmarks. The results reveal a noticeable improvement over both optimization methods for both optimized models, showing that the presented approach can serve as a viable alternative to current state-of-the-art solutions.