Continuous Tensor Relaxation for Finding Diverse Solutions in Combinatorial Optimization Problems

TMLR Paper4460 Authors

12 Mar 2025 (modified: 20 Mar 2025) · Under review for TMLR · CC BY 4.0
Abstract: Finding the optimal solution is often the primary goal in combinatorial optimization (CO). However, real-world applications frequently require diverse solutions rather than a single optimum, particularly in two key scenarios. First, when directly handling constraints is challenging, penalties are incorporated into the cost function, reformulating the problem as an unconstrained CO problem. Tuning these penalties to obtain a desirable solution is often time-consuming. Second, the optimal solution may lack practical relevance when the cost function or constraints only approximate a more complex real-world problem. To address these challenges, generating (i) penalty-diversified solutions by varying penalty intensities and (ii) variation-diversified solutions with distinct structural characteristics provides valuable insights, enabling practitioners to post-select the most suitable solution for their specific needs. However, efficiently discovering these diverse solutions is more challenging than finding a single optimal one. This study introduces Continuous Tensor Relaxation Annealing (CTRA), a computationally efficient framework for unsupervised-learning (UL)-based CO solvers that generates diverse solutions within a single training run. CTRA leverages representation learning and parallelization to automatically discover shared representations, substantially accelerating the search for these diverse solutions. Numerical experiments demonstrate that CTRA outperforms existing UL-based solvers in generating these diverse solutions while significantly reducing computational costs.
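As background for the penalty-diversified setting described in the abstract, the sketch below illustrates how a constrained CO problem can be reformulated with a penalty term, and how sweeping the penalty intensity yields a family of distinct solutions. This is a hypothetical toy example (a tiny knapsack solved by brute force), not the paper's CTRA method, which instead trains a continuously relaxed tensor with a UL-based solver; the problem data and penalty form are assumptions chosen for illustration.

```python
from itertools import product

# Toy knapsack: maximize value subject to weight <= capacity,
# reformulated as an unconstrained minimization with a quadratic
# penalty on the constraint violation. (Illustrative only.)
values = [6, 5, 4, 3]
weights = [4, 3, 3, 2]
capacity = 6

def penalized_cost(x, lam):
    """Unconstrained cost: -value + lam * violation^2 (to be minimized)."""
    value = sum(v * xi for v, xi in zip(values, x))
    weight = sum(w * xi for w, xi in zip(weights, x))
    violation = max(0, weight - capacity)
    return -value + lam * violation ** 2

def best_solution(lam):
    """Brute-force argmin over all binary assignments for a given penalty."""
    return min(product([0, 1], repeat=len(values)),
               key=lambda x: penalized_cost(x, lam))

# Sweeping the penalty intensity produces "penalty-diversified" solutions:
# weak penalties favor high value (possibly infeasible assignments),
# strong penalties push the optimum toward feasibility.
for lam in [0.0, 0.5, 3.0]:
    x = best_solution(lam)
    weight = sum(w * xi for w, xi in zip(weights, x))
    print(f"lam={lam}: x={x}, weight={weight}")
```

A practitioner can then post-select from this family, e.g. accept a mildly infeasible but high-value solution, without re-running the solver once per penalty value, which is the tuning cost the paper aims to eliminate.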
Submission Length: Regular submission (no more than 12 pages of main content)
Assigned Action Editor: ~Ruqi_Zhang1
Submission Number: 4460