Fit The Right NP-Hard Problem: End-to-end Learning of Integer Programming Constraints

Published: 12 Dec 2020 | Last Modified: 05 May 2023 | LMCA2020 Oral | Readers: Everyone
Keywords: integer programming, discrete optimization, hybrid architectures, learning constraints
TL;DR: We propose an end-to-end trainable architecture that learns both the objective and the constraints of an integer program from raw data, with state-of-the-art integer programming solvers in the loop.
Abstract: Bridging logical and algorithmic reasoning with modern machine learning techniques is a fundamental challenge with potentially transformative impact. On the algorithmic side, many NP-Hard problems can be expressed as integer programs, in which the constraints play the role of their ``combinatorial specification''. In this work, we aim to integrate integer programming solvers into neural network architectures by providing loss functions for \emph{both} the objective and the constraints. The resulting end-to-end trainable architectures have the power of jointly extracting features from raw data and of solving a suitable (learned) combinatorial problem with state-of-the-art integer programming solvers. We experimentally validate our approach on artificial datasets created from random constraints, and on solving \textsc{Knapsack} instances from their description in natural language.
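
The abstract describes a hybrid pipeline: a learned encoder maps raw data to the specification of an integer program (a cost vector and linear constraints), and an ILP solver then produces the discrete output. Below is a minimal, hypothetical sketch of that forward composition only; the stand-in linear encoder, the brute-force binary solver, and all names and shapes are illustrative assumptions, and the paper's loss functions for the objective and the constraints are not reproduced here.

```python
# Hypothetical sketch: raw input -> learned ILP specification -> exact solve.
# Not the paper's method; the learning signal (losses on objective and
# constraints) is the paper's contribution and is omitted here.
import itertools
import numpy as np

rng = np.random.default_rng(0)

N_VARS = 8      # number of binary decision variables
N_CONS = 4      # number of learned linear constraints
FEAT_DIM = 16   # dimension of the raw input features

# Stand-in "encoder": a random linear map from raw features to the ILP
# specification (a real architecture would use a trained neural network).
W_c = rng.normal(size=(FEAT_DIM, N_VARS))
W_A = rng.normal(size=(FEAT_DIM, N_CONS * N_VARS))
W_b = rng.normal(size=(FEAT_DIM, N_CONS))

def encode_instance(x):
    """Map raw features x to (c, A, b) of the ILP: max c^T z s.t. A z <= b."""
    c = x @ W_c
    A = (x @ W_A).reshape(N_CONS, N_VARS)
    b = np.abs(x @ W_b)  # kept non-negative so z = 0 stays feasible in this toy example
    return c, A, b

def solve_binary_ilp(c, A, b):
    """Brute-force exact solver over z in {0, 1}^n (fine for tiny n;
    a state-of-the-art ILP solver would be used in practice)."""
    best_z, best_val = None, -np.inf
    for bits in itertools.product([0, 1], repeat=len(c)):
        z = np.array(bits)
        if np.all(A @ z <= b) and c @ z > best_val:
            best_z, best_val = z, c @ z
    return best_z

# Forward pass: raw data -> learned integer program -> combinatorial solution.
x = rng.normal(size=FEAT_DIM)
c, A, b = encode_instance(x)
print("solver output:", solve_binary_ilp(c, A, b))
```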