Hybrid self-attention NEAT: a novel evolutionary self-attention approach to improve the NEAT algorithm in high dimensional inputs

Published: 2024 · Last Modified: 19 Jul 2025 · Evol. Syst. 2024 · CC BY-SA 4.0
Abstract: This article presents a "Hybrid Self-Attention NEAT" method to improve the original NeuroEvolution of Augmenting Topologies (NEAT) algorithm on high-dimensional inputs. Although the NEAT algorithm has achieved significant results on a range of challenging tasks, it cannot evolve a well-tuned network when the input representation is high-dimensional. Accordingly, we overcome this limitation by using the self-attention technique as an indirect encoding method to select the most important parts of the input. To tune the hyper-parameters of the self-attention module, we use the CMA-ES evolutionary algorithm. We also present an innovative method, called Seesaw, to evolve the populations of the NEAT and CMA-ES algorithms simultaneously. In addition to the evolutionary operators of the NEAT algorithm for updating the weights, we use a combination method to reach better-fitting weights. We tested our model on a variety of Atari games. The results show that, compared to state-of-the-art evolutionary algorithms, Hybrid Self-Attention NEAT eliminates the restriction of the original NEAT and achieves comparable scores from raw pixel input while using far fewer parameters (approximately 300× fewer than HyperNEAT).
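The core idea, self-attention as an indirect encoding that scores and selects the most important input regions before they reach the evolved network, can be illustrated with a minimal sketch. The function below is an assumption-laden illustration, not the paper's implementation: patch size, projection dimension, and the flat parameter vector (the quantity an evolutionary strategy such as CMA-ES would optimize) are hypothetical choices.

```python
import numpy as np

def patch_importance(frame, params, patch_size=8, d=4, top_k=10):
    """Score non-overlapping image patches with a tiny query/key
    self-attention module and return the indices of the top_k patches.

    frame:  2-D grayscale array of shape (H, W)
    params: flat parameter vector, reshaped into projections W_q and W_k
            (an evolutionary strategy like CMA-ES would evolve this vector)
    """
    H, W = frame.shape
    ph, pw = H // patch_size, W // patch_size
    # Flatten each non-overlapping patch into a feature vector.
    patches = (frame[:ph * patch_size, :pw * patch_size]
               .reshape(ph, patch_size, pw, patch_size)
               .transpose(0, 2, 1, 3)
               .reshape(ph * pw, patch_size * patch_size))
    n_feat = patch_size * patch_size
    # Unpack the evolved flat vector into query/key projection matrices.
    W_q = params[:n_feat * d].reshape(n_feat, d)
    W_k = params[n_feat * d:2 * n_feat * d].reshape(n_feat, d)
    q, k = patches @ W_q, patches @ W_k
    att = q @ k.T / np.sqrt(d)                     # scaled dot-product scores
    att = np.exp(att - att.max(axis=1, keepdims=True))
    att /= att.sum(axis=1, keepdims=True)          # row-wise softmax
    scores = att.sum(axis=0)                       # attention each patch receives
    return np.argsort(scores)[::-1][:top_k]

# Usage: a 64x64 frame yields 8x8 = 64 patches; the parameter vector
# has 2 * 64 * 4 = 512 entries for these (assumed) settings.
rng = np.random.default_rng(0)
selected = patch_importance(rng.standard_normal((64, 64)),
                            rng.standard_normal(512))
```

Only the selected patches would then be passed on as the (much smaller) input to the NEAT-evolved network, which is what allows the overall model to keep its parameter count low compared to approaches that consume raw pixels directly.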