Surrogate-assisted neural learning and evolutionary optimization for expensive constrained multi-objective problems

Published: 01 Jan 2025, Last Modified: 06 Nov 2025, Swarm Evol. Comput. 2025, CC BY-SA 4.0
Abstract: Expensive constrained multi-objective optimization problems (ECMOPs) are challenging because evaluating their objective and constraint functions is computationally costly, which severely limits the number of function evaluations that can be afforded. To address this issue, we propose an efficient surrogate-assisted constrained multi-objective evolutionary algorithm, named LEMO. LEMO integrates neural learning with a novel constraint screening strategy that dynamically constructs surrogate models for only the most relevant constraints. During optimization, a neural network learns the mapping from arbitrary weight vectors to their corresponding constrained Pareto-optimal solutions, enabling the generation of high-quality solutions with fewer expensive function evaluations. In addition, a constraint screening mechanism dynamically excludes constraints that are irrelevant to the current search phase, simplifying the surrogate models and improving the efficiency of the constrained search. To evaluate the effectiveness of LEMO, we compare it with seven state-of-the-art algorithms on three benchmark suites (LIRCMOP, DASCMOP, and MW) and a real-world optimization problem. The experimental results show that LEMO consistently outperforms these algorithms in both computational efficiency and solution quality.
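To make the weight-to-solution learning idea concrete, the sketch below shows one plausible way such a mapping could be realized: a small multilayer perceptron trained on (weight vector, archive solution) pairs and then queried with new weight vectors to propose candidates before any expensive evaluation. This is a minimal illustrative sketch under assumed details, not the authors' LEMO implementation; the class and function names, architecture, and hyperparameters are all hypothetical.

```python
# Illustrative sketch only (not the paper's LEMO code): learn a mapping from
# weight vectors to decision variables of good solutions found so far, then
# decode candidates for new weights without extra expensive evaluations.
import torch
import torch.nn as nn

class WeightToSolutionNet(nn.Module):
    """Maps an m-dimensional weight vector to an n-dimensional decision vector."""
    def __init__(self, n_objectives: int, n_variables: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_objectives, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_variables), nn.Sigmoid(),  # decision variables scaled to [0, 1]
        )

    def forward(self, w: torch.Tensor) -> torch.Tensor:
        return self.net(w)

def fit_mapping(weights: torch.Tensor, solutions: torch.Tensor,
                epochs: int = 200, lr: float = 1e-3) -> WeightToSolutionNet:
    """Fit the weight -> solution mapping on pairs collected from the evaluated
    archive (e.g. decomposition weights and the best solution found for each)."""
    model = WeightToSolutionNet(weights.shape[1], solutions.shape[1])
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(weights), solutions)
        loss.backward()
        opt.step()
    return model

if __name__ == "__main__":
    m, n, archive_size = 2, 10, 50
    w_archive = torch.rand(archive_size, m)
    w_archive = w_archive / w_archive.sum(dim=1, keepdim=True)  # normalize onto the simplex
    x_archive = torch.rand(archive_size, n)                     # placeholder archive solutions
    model = fit_mapping(w_archive, x_archive)
    candidate = model(torch.tensor([[0.3, 0.7]]))               # candidate for a new weight vector
    print(candidate.shape)
```

In a surrogate-assisted loop, candidates decoded this way would typically be filtered by the surrogate models (here, of the screened-in constraints and objectives) before spending any real function evaluations on them.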