Confirmation: I have read and agree with the workshop's policy on behalf of myself and my co-authors.
Tracks: Main Track
Keywords: Mixed-integer programming, Bayesian optimization, matheuristics, hyperparameter tuning
TL;DR: This paper studies the systematic tuning of hyperparameters of matheuristics by Bayesian optimization on a use case with small- and large-scale instances.
Abstract: Mixed-integer programming can handle optimization problems with complex constraints, but its computational cost often suffers from the combinatorial complexity of the problem. Decomposition-based matheuristics address this issue by splitting large-scale mixed-integer programs (MIPs) into smaller sub-problems. Matheuristics typically expose hyperparameters that may affect their performance. An analysis of related work reveals that the optimization potential of these hyperparameters is often left unexploited, leading to both inferior MIP solutions and unnecessarily high computational costs.
This paper studies a novel algorithmic approach to tuning the hyperparameters of matheuristics by Bayesian optimization. Fundamental properties of the algorithmic approach are examined in computational experiments with small- and large-scale instances of the use case. The results reveal two natural and competing objectives of the tuning problem: optimizing the MIP objective and the computational cost. While the two objectives can be optimized separately for small-scale instances, they must be handled jointly for large-scale instances.
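The tuning loop described above can be sketched as a minimal Bayesian optimization over a single matheuristic hyperparameter. This is an illustrative sketch only, not the paper's implementation: `solver_proxy` is a hypothetical stand-in for "run the matheuristic and return the MIP objective", the search interval and the lower-confidence-bound acquisition are assumptions, and the Gaussian-process surrogate is hand-rolled for self-containment.

```python
import numpy as np

def solver_proxy(x):
    """Hypothetical stand-in for running the matheuristic with
    hyperparameter x and returning the resulting MIP objective."""
    return (x - 0.3) ** 2 + 0.05 * np.sin(15 * x)

def rbf(a, b, length_scale=0.15):
    # Squared-exponential kernel between two 1-D point sets.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    # Gaussian-process posterior mean and variance at the grid points Xs.
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    Kinv = np.linalg.inv(K)
    mu = Ks.T @ Kinv @ y
    var = np.diag(rbf(Xs, Xs) - Ks.T @ Kinv @ Ks)
    return mu, np.maximum(var, 1e-12)

def bayes_opt(n_iter=15, kappa=2.0, seed=0):
    rng = np.random.default_rng(seed)
    grid = np.linspace(0.0, 1.0, 201)   # assumed hyperparameter range
    X = rng.uniform(0.0, 1.0, 3)        # initial random evaluations
    y = np.array([solver_proxy(x) for x in X])
    for _ in range(n_iter):
        mu, var = gp_posterior(X, y, grid)
        # Lower-confidence-bound acquisition: trade off the predicted
        # objective (mu) against the surrogate's uncertainty (var).
        lcb = mu - kappa * np.sqrt(var)
        x_next = grid[np.argmin(lcb)]
        X = np.append(X, x_next)
        y = np.append(y, solver_proxy(x_next))
    return X[np.argmin(y)], y.min()

best_x, best_y = bayes_opt()
# best_x is the hyperparameter value with the lowest observed proxy objective
```

Each acquisition step spends one (expensive) matheuristic run, so the surrogate's sample efficiency is what makes this approach attractive; a second objective such as runtime would turn the acquisition step into a multi-objective trade-off, as the abstract notes for large-scale instances.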
In future research the multi-objective aspect of the hyperparameter tuning problem will be examined more deeply, and the single-instance approach will be extended to multiple instances.
Submission Number: 69