An Adaptive Multi-Granular Pareto-Optimal Subspace Learning Algorithm for Sparse Large-Scale Multi-Objective Optimization

Published: 01 Jan 2025, Last Modified: 06 Nov 2025, CEC 2025, CC BY-SA 4.0
Abstract: Sparse large-scale multi-objective optimization problems are widespread across various domains, yet traditional mathematical methods and many existing multi-objective evolutionary algorithms struggle to achieve satisfactory results on them. In this paper, we propose an Adaptive Multi-Granular Pareto-optimal Subspace Learning algorithm (AMG-PSL). The algorithm employs a multi-level decision-space stratification mechanism based on variable importance, partitions the population using sparsity metrics derived from k-means clustering, and guides population evolution with a hierarchical mutation strategy in reduced subspaces constructed by unsupervised neural networks. It further incorporates a feedback-based adaptation scheme that uses offspring performance to guide solution generation and adjusts the neural network architectures according to the current non-dominated solutions. Experimental validation on eight benchmark problems and two practical applications demonstrates that AMG-PSL achieves superior optimization results compared to existing strategies for sparse large-scale optimization.
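The abstract does not give implementation details, but the idea of partitioning decision variables by sparsity via k-means can be illustrated concretely. The sketch below is a hypothetical reading, not the authors' code: it assumes a sparsity score equal to the fraction of solutions in which a variable is zero, and uses a minimal 1-D k-means (the function names `sparsity_scores` and `kmeans_1d` are illustrative).

```python
import numpy as np

def sparsity_scores(pop):
    """Fraction of solutions in which each decision variable is zero.

    pop: (n_solutions, n_vars) decision matrix.
    Returns an array of shape (n_vars,) with values in [0, 1];
    a hypothetical sparsity metric, not the paper's exact definition.
    """
    return np.mean(pop == 0, axis=0)

def kmeans_1d(x, k, iters=50, seed=0):
    """Minimal 1-D k-means: assigns each score to the nearest of k centers."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(x, size=k, replace=False).astype(float)
    for _ in range(iters):
        # Assign each point to its nearest center, then recompute centers.
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = x[labels == j].mean()
    return labels, centers

# Toy population: 100 solutions over 1000 variables, ~90% zeros.
rng = np.random.default_rng(1)
pop = rng.random((100, 1000)) * (rng.random((100, 1000)) < 0.1)
scores = sparsity_scores(pop)
labels, centers = kmeans_1d(scores, k=3)
```

Each resulting cluster groups variables with similar sparsity, so a multi-granular scheme could then apply a different mutation intensity or subspace model per group.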