Keywords: Graph Neural Network, Expressiveness, Integer Linear Programming, Large Neighborhood Search, Graph Decomposition, Learning to Optimize
TL;DR: We show that learning-based LNS methods share a common drawback, namely that the independence assumption ignores variable coupling, and propose a graph-decomposition-based LNS that performs coupling-aware neighborhood correction.
Abstract: Large Neighborhood Search (LNS) is a heuristic for integer programming that iteratively destroys part of an incumbent solution and repairs it with an Integer Linear Program (ILP) solver, efficiently exploring large solution spaces. Recent neural LNS methods achieve strong performance on ILPs by training graph neural networks (GNNs) to learn neighborhood-selection policies under an independence assumption. However, we show through a concrete example that this assumption ignores variable coupling and assigns equal probability to neighborhoods with vastly different optimization potential. To overcome this limitation, we propose coupling-enhanced neural LNS (CE-LNS), which augments GNN-based neighborhood prediction with graph decomposition to explicitly capture variable coupling, enabling coupling-aware calibration of neighborhood selection. Theoretically, CE-LNS can (i) predict whether constraints are effective or redundant and (ii) refine neighborhood predictions to approximate optimal neighborhoods. Empirically, CE-LNS outperforms existing neural LNS frameworks across diverse ILP benchmarks, demonstrating its effectiveness in escaping local optima.
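For readers unfamiliar with the destroy-repair loop the abstract describes, a minimal sketch follows. All names here (`select_neighborhood`, `repair`, `ilp.objective`) are illustrative assumptions, not the paper's CE-LNS implementation; in neural LNS the selection policy would be a GNN, and CE-LNS would additionally apply coupling-aware calibration to its output.

```python
def lns(ilp, incumbent, select_neighborhood, repair, n_iters=100):
    """Generic destroy-repair LNS loop for an ILP (illustrative sketch).

    ilp:                 problem instance exposing an objective(assignment) method
    incumbent:           initial feasible assignment (e.g. dict var -> value)
    select_neighborhood: policy returning the variable subset to destroy
                         (a learned GNN policy in neural LNS)
    repair:              solves the sub-ILP over the destroyed variables
                         with all remaining variables fixed
    """
    best = incumbent
    for _ in range(n_iters):
        # Destroy: choose which part of the incumbent to re-optimize.
        destroyed = select_neighborhood(ilp, best)
        # Repair: hand the restricted sub-problem to the ILP solver.
        candidate = repair(ilp, best, destroyed)
        # Accept only improving solutions (assuming minimization).
        if candidate is not None and ilp.objective(candidate) < ilp.objective(best):
            best = candidate
    return best
```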
Primary Area: optimization
Submission Number: 12765