Abstract: Deep neural networks (DNNs) require intensive tuning of their configurations, such as network structures and learning conditions. This tuning is a type of black-box optimization problem to which evolutionary algorithms are applicable. A distinctive property of evolutionary optimization of DNN configurations is that the optimization has a double structure: the evolutionary algorithm optimizes a chromosome representing the DNN configuration, while an individual DNN with that configuration learns from training data, typically by back-propagation. With the aim of obtaining better-optimized DNNs through evolutionary algorithms, we propose a dual inheritance evolution strategy based on an analogy to human brain evolution, in which genes and culture co-evolve. The proposed method extends a conventional evolution strategy with an additional pass that directly propagates culture, or knowledge, from ancestor DNNs to descendant DNNs by integrating teacher-student learning. We apply the proposed method to the automatic tuning of an end-to-end neural network-based speech recognition system. Experimental results show that the proposed method produces a smaller model with higher recognition performance than a baseline optimization based on the Covariance Matrix Adaptation Evolution Strategy (CMA-ES).
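The dual-inheritance idea described above can be illustrated on a toy problem. The sketch below is purely illustrative and not the paper's actual setup: a (1, λ) evolution strategy evolves a single hyperparameter (a learning rate standing in for the DNN configuration), while the `cultural_inheritance` flag decides whether each offspring warm-starts its "learned" parameters from the parent's trained parameters (the knowledge-propagation pass) or learns from scratch, as in a plain evolution strategy. The quadratic "training" task, the clamp on the learning rate, and all names are assumptions for the sake of a runnable example.

```python
import math
import random

random.seed(0)

TARGET = [1.0, -2.0, 0.5]  # toy optimum standing in for "training data"

def train(w, lr, steps=3):
    """Toy stand-in for back-propagation: a few gradient steps on a quadratic loss."""
    for _ in range(steps):
        w = [wi - lr * 2.0 * (wi - ti) for wi, ti in zip(w, TARGET)]
    loss = sum((wi - ti) ** 2 for wi, ti in zip(w, TARGET))
    return w, loss

def evolve(cultural_inheritance, generations=20, offspring=8):
    """(1, lambda)-ES over one hyperparameter (the learning rate).

    With cultural_inheritance=True, each offspring initializes its weights
    from the parent's trained weights (knowledge passed between generations);
    with False, every offspring learns from scratch."""
    parent_lr, parent_w = 0.05, [0.0, 0.0, 0.0]
    best_loss = float("inf")
    for _ in range(generations):
        candidates = []
        for _ in range(offspring):
            # log-normal mutation of the evolved hyperparameter, clamped for stability
            lr = min(0.2, parent_lr * math.exp(0.3 * random.gauss(0.0, 1.0)))
            init = list(parent_w) if cultural_inheritance else [0.0, 0.0, 0.0]
            w, loss = train(init, lr)
            candidates.append((loss, lr, w))
        # comma-selection: the best offspring replaces the parent
        best_loss, parent_lr, parent_w = min(candidates)
    return best_loss
```

In this toy setting, the run with cultural inheritance keeps accumulating learned parameters across generations, so it reaches a far lower loss than the run where every offspring restarts from scratch; the same intuition motivates propagating knowledge from ancestor DNNs to descendants via teacher-student learning.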