AutoML With Parallel Genetic Algorithm for Fast Hyperparameters Optimization in Efficient IoT Time Series Prediction

Published: 01 Jan 2023, Last Modified: 09 Nov 2023, IEEE Trans. Ind. Informatics 2023
Abstract: With the development of artificial intelligence and the improvement of hardware computing power, deep learning models have become widely used in the Internet of Things (IoT) field, especially for analyzing spatiotemporal data collected by wireless sensors. Recurrent neural networks (RNNs) such as the long short-term memory (LSTM) network are generally used for such time-series data. The hyperparameter settings used in model training are essential factors in the performance of deep learning models. Manually optimizing hyperparameters not only costs more resources but is also more likely to follow stereotyped choices, resulting in unreasonable hyperparameter settings and poor model performance. As one of the most important fields in automated machine learning research, automated hyperparameter optimization (HPO) mainly includes grid search, genetic-algorithm-based hyperparameter search, etc., but these methods have their own drawbacks. In this article, an automated HPO method based on a parallel genetic algorithm (PGA) is proposed. Following the PGA process, this article divides HPO into several stages: population initialization, fitness evaluation, tournament selection, crossover operators, mutation operators, subgroup exchange, and termination of evolution. The proposed HPO method is then applied to LSTM models and tested on two different time-series datasets collected by real-world IoT sensors. Comparisons with other mainstream HPO methods on both datasets show that the proposed PGA-based HPO method performs better in both time cost and prediction accuracy.
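The abstract lists the stages of the PGA-based search (population initialization, fitness evaluation, tournament selection, crossover, mutation, subgroup exchange, termination). A minimal island-model sketch of those stages is given below. The search space, fitness function, and all parameter values here are illustrative assumptions, not the paper's actual configuration: in the paper's setting, the fitness of an individual would come from training an LSTM with those hyperparameters and measuring its validation error.

```python
# Sketch of an island-model (parallel) genetic algorithm for hyperparameter
# search, following the stages named in the abstract. The search space and
# fitness function are hypothetical stand-ins; a real run would train an
# LSTM for each individual and return, e.g., its negated validation loss.
import random

SEARCH_SPACE = {                       # hypothetical hyperparameter ranges
    "learning_rate": (1e-4, 1e-1),
    "hidden_units": (16, 256),         # would be rounded to int in practice
    "dropout": (0.0, 0.5),
}

def random_individual():
    # Population initialization: sample each gene uniformly from its range.
    return {k: random.uniform(lo, hi) for k, (lo, hi) in SEARCH_SPACE.items()}

def fitness(ind):
    # Stand-in for "train LSTM, return -validation_loss": rewards
    # hyperparameters near an arbitrary synthetic optimum.
    return -((ind["learning_rate"] - 0.01) ** 2
             + (ind["hidden_units"] / 256 - 0.5) ** 2
             + (ind["dropout"] - 0.2) ** 2)

def tournament(pop, k=3):
    # Tournament selection: best of k randomly drawn individuals.
    return max(random.sample(pop, k), key=fitness)

def crossover(a, b):
    # Uniform crossover: each gene inherited from either parent.
    return {k: random.choice((a[k], b[k])) for k in SEARCH_SPACE}

def mutate(ind, rate=0.2):
    # Mutation: resample each gene with a small probability.
    for k, (lo, hi) in SEARCH_SPACE.items():
        if random.random() < rate:
            ind[k] = random.uniform(lo, hi)
    return ind

def evolve(islands, generations=20, migrate_every=5):
    for gen in range(generations):
        # Each island evolves independently (parallelizable across workers).
        islands = [
            [mutate(crossover(tournament(isl), tournament(isl)))
             for _ in isl]
            for isl in islands
        ]
        if (gen + 1) % migrate_every == 0:
            # Subgroup exchange: each island's best individual replaces
            # the worst individual of the next island (ring topology).
            for i, isl in enumerate(islands):
                best = max(isl, key=fitness)
                nxt = islands[(i + 1) % len(islands)]
                worst = min(range(len(nxt)), key=lambda j: fitness(nxt[j]))
                nxt[worst] = dict(best)
    # End of evolution: return the best individual across all islands.
    return max((ind for isl in islands for ind in isl), key=fitness)

islands = [[random_individual() for _ in range(10)] for _ in range(4)]
best = evolve(islands)
```

In a practical deployment, the per-island evolution loop is what runs in parallel (e.g., one process or worker per island), since fitness evaluation, here a cheap arithmetic expression, dominates wall-clock time when it involves training a model.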