Graph Neural Networks for Hyperparameter Inference in Ising Solvers

Published: 10 Oct 2024, Last Modified: 07 Dec 2024, NeurIPS 2024 Workshop, CC BY 4.0
Keywords: graph neural networks, GNN, combinatorial optimization, Ising solver, parameter tuning
TL;DR: Graph neural networks are trained to predict optimal hyperparameters for a heuristic Ising solver
Abstract: We propose a novel method for applying graph neural networks (GNNs) to combinatorial optimization problems. Unlike existing approaches that use GNNs to solve problem instances directly, our method uses them to predict hyperparameters for a heuristic solver. The model is trained in a supervised fashion on a small dataset of graphs whose target hyperparameters are obtained through conventional hyperparameter optimization routines. At inference time, the model predicts near-optimal hyperparameters that minimize the runtime of the heuristic solver on unseen instances. Experiments show that our method generalizes well to much larger graphs and outperforms hand-tuned parameters. The framework is flexible and can be applied to a wide variety of combinatorial optimization problems and heuristic solvers.
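The sketch below illustrates the general pipeline described in the abstract: a GNN that embeds a graph instance and regresses a fixed-size vector of solver hyperparameters, trained with a supervised loss against targets produced offline by a conventional hyperparameter optimization routine. This is a minimal sketch assuming PyTorch Geometric; the architecture, feature choices, and hyperparameter names (e.g. sweeps, restarts) are illustrative assumptions, not details from the paper.

```python
# Minimal sketch (not the authors' code): a GNN that regresses heuristic-solver
# hyperparameters from a graph/Ising instance. Assumes PyTorch Geometric;
# model and feature choices are illustrative.
import torch
import torch.nn as nn
from torch_geometric.nn import GCNConv, global_mean_pool

class HyperparamGNN(nn.Module):
    def __init__(self, in_dim=1, hidden=64, n_hparams=3):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden)
        self.conv2 = GCNConv(hidden, hidden)
        self.head = nn.Sequential(
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_hparams),  # e.g. sweeps, temperature schedule, restarts (hypothetical)
        )

    def forward(self, data):
        # Message passing over the instance graph, then graph-level pooling.
        x = torch.relu(self.conv1(data.x, data.edge_index))
        x = torch.relu(self.conv2(x, data.edge_index))
        g = global_mean_pool(x, data.batch)   # one embedding per graph
        return self.head(g)                   # predicted hyperparameter vector

def train(model, loader, epochs=100, lr=1e-3):
    # Supervised regression against hyperparameters found offline by a
    # conventional HPO routine (stored as batch.y, one target vector per graph).
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for batch in loader:
            opt.zero_grad()
            loss = loss_fn(model(batch), batch.y)
            loss.backward()
            opt.step()
```

At inference, the predicted vector would be passed directly to the heuristic solver for an unseen (possibly much larger) graph, replacing hand-tuned defaults.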
Submission Number: 68