Abstract: In practical applications, combinatorial optimization problems are intrinsically complex, typically involving discrete decision variables and fitness evaluations that are both expensive and noisy. Surrogate-assisted evolutionary algorithms (SAEAs) are commonly used to solve expensive optimization problems by replacing costly fitness evaluations with computationally cheaper surrogate models. The quality and quantity of training data are two crucial factors affecting a surrogate model's accuracy, especially in noisy environments. Implicit and explicit averaging are two straightforward and effective noise-tolerant techniques, both fully applicable to combinatorial optimization problems. When fitness evaluations are noisy and the evaluation budget is limited, implicit averaging tends to yield a large quantity of lower-quality training data, whereas explicit averaging exhibits the opposite trend. This paper investigates which of these two noise-tolerant techniques is better suited for embedding into SAEAs. Results on six multidimensional knapsack problems show that explicit averaging is the better choice, regardless of whether the noise is additive or multiplicative.
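The contrast between the two techniques can be sketched in code. The following is a minimal illustration, not the paper's implementation: `noisy_fitness` is a hypothetical expensive evaluation with additive Gaussian noise on a placeholder objective, and `explicit_average` spends `k` evaluations re-sampling one solution to produce a single lower-variance training point (implicit averaging would instead spend those `k` evaluations on `k` distinct solutions, yielding more but noisier data).

```python
import random

def noisy_fitness(x, noise_std=0.1):
    """Hypothetical expensive evaluation: a placeholder true fitness
    (sum of the decision vector) corrupted by additive Gaussian noise."""
    true_value = sum(x)
    return true_value + random.gauss(0.0, noise_std)

def explicit_average(x, k=10):
    """Explicit averaging: re-evaluate the same solution k times and
    average, trading evaluation budget for lower-variance data."""
    return sum(noisy_fitness(x) for _ in range(k)) / k
```

Under additive noise with standard deviation sigma, the averaged estimate has standard error sigma / sqrt(k), which is the quality-for-quantity trade-off the abstract describes.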