Keywords: Spectral Annealing, Ising Model, Combinatorial Optimization, Perturbation
TL;DR: Smoothly annealing the spectral-partitioning operator yields scalable Ising model solvers that combine the efficiency of spectral methods with the solution quality of expensive metaheuristics
Abstract: Large-scale Ising optimization underlies critical applications in optimization and learning, including correlation clustering, MAP inference, and energy-based models. Existing methods face a fundamental efficiency-quality tradeoff: spectral relaxations achieve polynomial-time complexity but deliver approximate solutions with significant optimality gaps, while metaheuristics approach optimal solutions but scale poorly to large instances. The core limitation of spectral methods is their single-shot nature: they solve exactly one eigenvalue problem, providing no solution exploration and exhibiting acute sensitivity to normalization choices.
We introduce *spectral annealing* (SpecAnn), a new spectral paradigm that transforms spectral methods from single-shot approximations into systematic exploration-based optimization, annealing from the raw adjacency to the fully normalized signed Laplacian. We propose a *diagonal predictor* that exploits the algebraic structure of this continuation path for efficient traversal, together with an efficient *batched solution refinement* that leverages the massive parallelism of GPU acceleration. Experiments on problems with up to 8.4 million variables demonstrate that SpecAnn achieves near-optimal solutions with runtimes of 1--57 seconds, bridging fast but inaccurate spectral methods and slow but high-quality metaheuristics. SpecAnn delivers an 83$\times$ speedup over recent GPU-accelerated methods while maintaining superior solution quality, and scales successfully beyond 262K variables, where traditional metaheuristics fail.
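To make the continuation idea concrete, here is a minimal dense-matrix sketch of the annealing loop described in the abstract. It is not the authors' implementation (which uses a diagonal predictor and GPU-batched refinement); it only illustrates the path from the raw adjacency toward the signed Laplacian, re-solving an eigenvalue problem at each step and keeping the best rounded spin assignment. All function and variable names here are assumptions for illustration.

```python
import numpy as np

def signed_laplacian(A):
    # Signed Laplacian L = D_abs - A, where D_abs carries the
    # degrees of absolute edge weights on its diagonal.
    return np.diag(np.abs(A).sum(axis=1)) - A

def spectral_annealing(A, steps=10):
    # Interpolate the spectral operator from -A (t=0, raw adjacency
    # relaxation) to the signed Laplacian L (t=1), rounding the leading
    # eigenvector to spins at each step and tracking the best energy.
    L = signed_laplacian(A)
    best_spins, best_energy = None, np.inf
    for t in np.linspace(0.0, 1.0, steps):
        M = (1.0 - t) * (-A) + t * L
        # Eigenvector of the smallest eigenvalue is the spectral relaxation.
        _, V = np.linalg.eigh(M)
        spins = np.sign(V[:, 0])
        spins[spins == 0] = 1.0
        # Ising energy to minimize: E(s) = -1/2 * s^T A s.
        energy = -0.5 * spins @ A @ spins
        if energy < best_energy:
            best_spins, best_energy = spins, energy
    return best_spins, best_energy
```

On a toy instance with two positively coupled pairs joined by negative couplings, every point on the path already recovers the ground state; the paper's contribution is making this sweep cheap at millions of variables, where each eigenproblem can no longer be solved from scratch.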
Primary Area: optimization
Submission Number: 24332