Automatic Parameter Optimisation Framework for ECoS-based Models

Published: 01 Jan 2023 · Last Modified: 18 May 2025 · IJCNN 2023 · CC BY-SA 4.0
Abstract: Over the past decade there have been efforts to address a major issue in obtaining high-performing Machine Learning (ML) models: the need for more systematic, automated methods of selecting the hyper-parameters that govern learning, and thereby increasing the accuracy of these models. These methods have largely been driven by the popularity of deep learning, though traditional ML models have also benefited from these advances. One class of ML models to which such hyper-parameter optimization techniques have rarely been applied is a family of neural-network-based learners called Evolving Connectionist Systems (ECoS). The focus of this research is to determine whether ECoS-based learners can improve their performance through such hyper-parameter optimization methods. In this paper we investigate how three different hyper-parameter optimization techniques govern the learning of two ECoS variants: the Self-Evolving Connectionist System (SECoS) and the Evolving Fuzzy Neural Network (EFuNN). The results of these experiments indicate that the Tree-structured Parzen Estimator (TPE) hyper-parameter optimization algorithm works well for both SECoS and EFuNN learning on classification tasks. Questions remain, however, about the degree to which an appropriate hyper-parameter optimization framework can be adopted for ECoS-based learners in general.
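To make the idea concrete, the sketch below shows a minimal, self-contained version of the TPE loop the abstract refers to: observed trials are split into a "good" fraction and a "bad" fraction, a kernel-density estimate is fitted to each, and the next hyper-parameter value is the candidate maximizing the ratio l(x)/g(x). The one-dimensional search space and the quadratic stand-in objective are purely hypothetical illustrations (a real application would evaluate a trained SECoS or EFuNN model); production work would typically use a library implementation such as hyperopt or Optuna.

```python
import math
import random

def kde_pdf(samples, x, bw):
    # Gaussian mixture density with one kernel per observed sample.
    return sum(
        math.exp(-0.5 * ((x - s) / bw) ** 2) / (bw * math.sqrt(2 * math.pi))
        for s in samples
    ) / len(samples)

def tpe_suggest(history, bounds, gamma=0.25, n_candidates=24):
    # Split trials: the best gamma fraction forms l(x), the rest forms g(x).
    history = sorted(history, key=lambda t: t[1])
    n_good = max(1, int(gamma * len(history)))
    good = [x for x, _ in history[:n_good]]
    bad = [x for x, _ in history[n_good:]] or good
    lo, hi = bounds
    bw = (hi - lo) / 10.0  # fixed bandwidth; real TPE adapts this per point
    candidates = [random.uniform(lo, hi) for _ in range(n_candidates)]
    # Choose the candidate with the highest expected-improvement proxy l(x)/g(x).
    return max(candidates, key=lambda x: kde_pdf(good, x, bw) /
                                         (kde_pdf(bad, x, bw) + 1e-12))

def optimize(objective, bounds, n_trials=40, n_random=10):
    # Warm up with random search, then switch to TPE suggestions.
    history = []
    for i in range(n_trials):
        x = random.uniform(*bounds) if i < n_random else tpe_suggest(history, bounds)
        history.append((x, objective(x)))
    return min(history, key=lambda t: t[1])

# Hypothetical stand-in objective: loss minimized at a hyper-parameter value of 0.6.
random.seed(0)
best_x, best_loss = optimize(lambda x: (x - 0.6) ** 2, bounds=(0.0, 1.0))
```

In the paper's setting, the objective would instead train an ECoS model with the suggested hyper-parameters (e.g. sensitivity and error thresholds) and return its validation error.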