Exploring Diverse Solutions for Underdetermined Problems

Published: 10 Jun 2025, Last Modified: 15 Jul 2025
Venue: MOSS@ICML 2025
License: CC BY 4.0
Keywords: theory-informed learning, data-free, diversity, mode collapse
TL;DR: We showcase a novel diversity loss that effectively enables theory-informed generative models to explore and produce multiple distinct solutions in data-free settings.
Abstract: This work explores the utility of a recently proposed diversity loss in training generative, theory-informed models on underdetermined problems with multiple solutions. Unlike data-driven methods, theory-informed learning often operates in data-free settings, optimizing neural networks to satisfy objectives and constraints. We demonstrate how this diversity loss encourages the generation of diverse solutions across various example problems, effectively avoiding mode collapse and enabling exploration of the solution space.
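Below is a minimal, hypothetical sketch of the general setup the abstract describes: a generative network trained data-free against a theory-informed constraint, with an added diversity term to discourage mode collapse. The specific constraint (points on the unit circle), the pairwise-repulsion form of the diversity penalty, the network architecture, and the weight `lam` are all illustrative assumptions, not the loss proposed in the paper.

```python
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Maps latent codes z to candidate solutions x of an underdetermined problem."""
    def __init__(self, latent_dim=8, solution_dim=2, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, solution_dim),
        )

    def forward(self, z):
        return self.net(z)

def constraint_residual(x):
    # Illustrative underdetermined constraint: x1^2 + x2^2 = 1
    # (infinitely many valid solutions on the unit circle).
    return (x.pow(2).sum(dim=1) - 1.0).pow(2).mean()

def diversity_penalty(x, eps=1e-8):
    # Generic repulsion term (assumed, not the paper's loss): penalize
    # small pairwise distances between generated solutions so the
    # generator does not collapse onto a single mode.
    d = torch.cdist(x, x) + eps
    off_diag = ~torch.eye(x.shape[0], dtype=torch.bool, device=x.device)
    return (1.0 / d[off_diag]).mean()

latent_dim = 8
gen = Generator(latent_dim=latent_dim)
opt = torch.optim.Adam(gen.parameters(), lr=1e-3)
lam = 0.01  # assumed weight balancing constraint satisfaction vs. diversity

for step in range(2000):
    z = torch.randn(64, latent_dim)   # data-free: training uses only random latents
    x = gen(z)
    loss = constraint_residual(x) + lam * diversity_penalty(x)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Without the diversity term, such a generator can satisfy the constraint by mapping every latent code to the same point; the added penalty trades a small amount of constraint accuracy for coverage of the solution set.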
Code: zip
Submission Number: 33