FSEO: Few-Shot Evolutionary Optimization via Meta-Learning for Expensive Multi-Objective Optimization
Keywords: Expensive optimization, meta-learning, few-shot optimization, multi-objective optimization
Abstract: Meta-learning has been shown to improve the sampling efficiency of Bayesian optimization (BO) and surrogate-assisted evolutionary algorithms (SAEAs) when solving expensive optimization problems (EOPs).
Existing studies mainly focus either on combining existing meta-learning modeling methods with optimization algorithms, or on developing meta-learning acquisition functions for specific meta-BO methods. However, the meta-learning models used in the literature are not designed for optimization purposes, and the generalization ability of meta-learning acquisition functions is limited.
In this work, we develop a novel meta-learning model architecture designed for optimization and propose a generalized few-shot evolutionary optimization (FSEO) framework to solve EOPs.
We focus on expensive multi-objective optimization problems (EMOPs) in the few-shot setting, as this scenario has received little study and places high demands on surrogate modeling performance.
The surrogates in the FSEO framework combine a neural network with Gaussian processes (GPs): the network parameters and some GP parameters represent task-independent experience and are meta-learned across related optimization tasks, while the remaining GP parameters are task-specific and capture the unique features of the target task (a hypothetical sketch of this surrogate structure follows the abstract).
We demonstrate that our FSEO framework is able to improve the sampling efficiency of existing SAEAs on EMOPs.
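The abstract describes a surrogate in which a shared neural network feeds into per-task GPs. The following is a minimal, hypothetical sketch of that kind of deep-kernel surrogate using PyTorch and GPyTorch; it is not the authors' implementation, and for simplicity it treats only the network as meta-learned (shared) while all GP hyperparameters are adapted per task. The class and function names (`SharedFeatureExtractor`, `DeepKernelGP`, `adapt_to_task`) are illustrative assumptions.

```python
import torch
import torch.nn as nn
import gpytorch


class SharedFeatureExtractor(nn.Module):
    """Task-independent feature network; its weights would be meta-learned
    across related optimization tasks (assumption: a small MLP suffices)."""

    def __init__(self, in_dim: int, feat_dim: int = 16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 64), nn.ReLU(),
            nn.Linear(64, feat_dim),
        )

    def forward(self, x):
        return self.net(x)


class DeepKernelGP(gpytorch.models.ExactGP):
    """GP defined on the shared features; its kernel and noise parameters
    stay task-specific."""

    def __init__(self, train_x, train_y, likelihood, extractor):
        super().__init__(train_x, train_y, likelihood)
        self.extractor = extractor
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())

    def forward(self, x):
        z = self.extractor(x)
        return gpytorch.distributions.MultivariateNormal(
            self.mean_module(z), self.covar_module(z)
        )


def adapt_to_task(train_x, train_y, extractor, steps: int = 100, lr: float = 0.05):
    """Few-shot adaptation to a target task: freeze the meta-learned extractor
    and fit only the task-specific GP parameters on the few expensive samples."""
    likelihood = gpytorch.likelihoods.GaussianLikelihood()
    model = DeepKernelGP(train_x, train_y, likelihood, extractor)
    # Exclude the shared extractor's parameters from task-level training.
    task_params = [p for n, p in model.named_parameters() if not n.startswith("extractor")]
    opt = torch.optim.Adam(task_params, lr=lr)
    mll = gpytorch.mlls.ExactMarginalLogLikelihood(likelihood, model)
    model.train()
    likelihood.train()
    for _ in range(steps):
        opt.zero_grad()
        loss = -mll(model(train_x), train_y)
        loss.backward()
        opt.step()
    return model, likelihood
```

In a multi-objective setting, one such GP would typically be adapted per objective, with the single shared extractor reused across objectives and tasks; this is one plausible reading of the described architecture, not a statement of the paper's exact design.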
Supplementary Material: zip
Primary Area: Optimization (e.g., convex and non-convex, stochastic, robust)
Submission Number: 5557