Expensive Multiobjective Optimization Based on Information Transfer Surrogate

Published: 01 Jan 2023, Last Modified: 12 May 2023. IEEE Trans. Syst. Man Cybern. Syst., 2023.
Abstract: Objective value estimation based on computationally efficient surrogate models is widely used to reduce the computational cost of solving expensive multiobjective optimization problems (MOPs). However, due to the scarcity of training data and the lack of data sharing between training tasks in a surrogate-based system, the estimation effectiveness of the surrogate models might not be satisfactory. In this study, we present a novel surrogate methodology based on information transfer to deal with this problem. In particular, in the proposed framework, the objectives of an MOP, which may have little apparent similarity or correlation, are linearly mapped to a number of related tasks. These related tasks are then used to train a multitask Gaussian process (MTGP). The MTGP expands the training data, leading to more confident learning of the model parameters. The predicted values of the objective functions are obtained by a reverse mapping from the learned MTGP model. In this way, the computational burden of the expensive objective functions of an MOP can be substantially reduced while good estimation accuracy is maintained. The MTGP facilitates mutual information transfer across tasks, avoids learning from scratch for new tasks, and captures the underlying structural information shared between tasks. The proposed surrogate approach is merged into MOEA/D to address MOPs. Experimental tests under various scenarios indicate that the resultant algorithm outperforms other state-of-the-art surrogate-based multiobjective optimization algorithms.
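The abstract describes a forward linear mapping from objectives to related tasks, surrogate training on the mapped tasks, and a reverse mapping to recover objective estimates. The following is a minimal sketch of that pipeline under stated assumptions, not the paper's implementation: the toy objectives, the random invertible mapping matrix A, and the function names (evaluate_objectives) are hypothetical, and independent scikit-learn GPs stand in for the multitask Gaussian process, so the cross-task information sharing of a true MTGP is not reproduced here.

```python
# Sketch only: toy objectives, a random invertible linear mapping, and
# independent GPs standing in for the MTGP described in the paper.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Toy expensive MOP with m = 2 objectives over a d = 3 dimensional decision space.
def evaluate_objectives(X):
    f1 = np.sum(X**2, axis=1)
    f2 = np.sum((X - 1.0)**2, axis=1)
    return np.column_stack([f1, f2])

d, m, n_train = 3, 2, 20
X_train = rng.random((n_train, d))
Y_train = evaluate_objectives(X_train)          # expensive evaluations, shape (n_train, m)

# Forward step: linearly map the m objectives to m related tasks with an
# invertible transform A (a random well-conditioned matrix here; the paper's
# choice of mapping is not reproduced).
A = np.eye(m) + 0.3 * rng.standard_normal((m, m))
Z_train = Y_train @ A.T                         # task values, shape (n_train, m)

# Train one surrogate per mapped task. A true MTGP would couple the tasks
# through a joint covariance; independent GPs keep the sketch self-contained.
surrogates = []
for t in range(m):
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
    gp.fit(X_train, Z_train[:, t])
    surrogates.append(gp)

# Reverse step: predict the task values at new candidate solutions and map
# them back to objective space with the inverse transform.
X_new = rng.random((5, d))
Z_pred = np.column_stack([gp.predict(X_new) for gp in surrogates])
Y_pred = Z_pred @ np.linalg.inv(A).T            # estimated objective values

print("estimated objectives:\n", Y_pred)
print("true objectives:\n", evaluate_objectives(X_new))
```

In a surrogate-assisted MOEA/D loop, the cheap estimates Y_pred would screen candidate solutions so that only the most promising ones are evaluated with the expensive objective functions.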