Exploring Self-Adaptive Genetic Algorithms to Combine Compact Sets of Rules

Published: 2024 · Last Modified: 10 Feb 2025 · CEC 2024 · CC BY-SA 4.0
Abstract: Rule-based machine learning (RBML) models are often presumed to be very beneficial for tasks where explainability of machine learning models is considered essential. However, their models are only truly explainable as long as their rule sets are compact. This creates the need for an optimizer that treats both prediction error and rule count as objectives. Given the highly complex fitness landscape of rule set learning tasks, good optimizer hyperparameters, as well as robustness against local minima, are crucial. In this paper, we explore the use of four self-adaptive genetic algorithms (SAGAs) for the optimization of a recent evolutionary RBML system, with the aim of reducing the number of hyperparameters to tune and potentially finding better minima. To evaluate the advantages, we benchmark against a non-adaptive genetic algorithm (GA) on five real-world data sets. We find, with the support of a rigorous statistical analysis, that some of the SAGAs deliver a suitable alternative that is easier to handle for non-experts in GA configuration. This is crucial for a wider application of this RBML method.
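The core idea of self-adaptation mentioned in the abstract can be illustrated with a small sketch. The following toy example (not the paper's actual method or data) encodes each individual as a bit mask selecting rules plus its own mutation rate; the rate is itself mutated, so no fixed rate needs to be hand-tuned, while the fitness trades off a hypothetical prediction error against rule count. All names, weights, and the penalty factor are illustrative assumptions.

```python
import random

random.seed(0)

N_RULES = 20
# Hypothetical per-rule contribution to reducing prediction error.
GAIN = [random.random() for _ in range(N_RULES)]

def fitness(ind):
    """Lower is better: toy error plus a penalty on rule-set size."""
    mask, _rate = ind
    error = 1.0 - sum(g for g, m in zip(GAIN, mask) if m) / sum(GAIN)
    return error + 0.02 * sum(mask)  # penalize large (less explainable) rule sets

def mutate(ind):
    mask, rate = ind
    # Self-adaptation: perturb the individual's own mutation rate first...
    rate = min(0.5, max(0.005, rate * random.lognormvariate(0.0, 0.3)))
    # ...then flip each rule bit with the (newly adapted) rate.
    mask = [b ^ (random.random() < rate) for b in mask]
    return (mask, rate)

def run(pop_size=30, gens=50):
    # Each individual: (rule mask, personal mutation rate).
    pop = [([random.random() < 0.5 for _ in range(N_RULES)],
            random.uniform(0.01, 0.3)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)                  # elitist truncation selection
        parents = pop[:pop_size // 2]
        pop = parents + [mutate(random.choice(parents))
                         for _ in range(pop_size - len(parents))]
    return min(pop, key=fitness)

best_mask, best_rate = run()
```

Because the mutation rate travels with each individual and is selected indirectly through the fitness of the rule sets it produces, the user no longer has to tune it by hand, which is the usability benefit for non-experts that the paper investigates.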