Multi-Objective Multi-Solution Transport

24 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: optimization
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Multi-Objective Optimization
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: In multi-objective optimization, we introduce "Multi-objective multi-solution Transport (MosT)", a novel framework that optimizes multiple objectives using multiple solutions. The essence lies in achieving diverse trade-offs among objectives: each solution acts as a domain expert focusing on specific objectives, while the solutions collectively cover all of them. Traditional methods often struggle, especially when the objectives greatly outnumber the solutions, yielding either subpar solutions or objectives that are essentially ignored. MosT addresses this by formulating the problem as a bi-level optimization of weighted objectives, where the weights are defined by an optimal transport between the objectives and solutions. Our algorithm not only has guaranteed convergence to diverse Pareto-front solutions but also adapts to cases where objectives outnumber solutions. We further improve its efficiency by introducing a solution-specialization curriculum. On federated learning, fairness-accuracy trade-off tasks, and standard MOO benchmarks, MosT distinctly outperforms existing methods, delivering high-quality, diverse solutions that profile the entire Pareto frontier and thus ensure balanced trade-offs across objectives.
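The paper's exact algorithm is not reproduced in this abstract, but the bi-level idea it describes can be illustrated with a toy sketch: alternate between (1) computing an optimal-transport plan between objectives and solutions from the current loss matrix, and (2) updating each solution on its transport-weighted objectives. Everything below is an assumption for illustration only: the scalar objectives f_i(x) = (x − t_i)², the uniform marginals, and the entropy-regularized Sinkhorn solver are stand-ins, not the authors' actual formulation.

```python
import numpy as np

# Toy instance (illustrative, not from the paper): three scalar
# objectives f_i(x) = (x - t_i)^2 and three scalar solutions.
targets = np.array([-2.0, 0.0, 2.0])

def loss_matrix(xs):
    """C[i, j] = f_i(x_j): rows index objectives, columns index solutions."""
    return (targets[:, None] - xs[None, :]) ** 2

def grad(x, i):
    """Gradient of objective i at solution x."""
    return 2.0 * (x - targets[i])

def sinkhorn(C, reg=1.0, n_iters=200):
    """Entropy-regularized OT plan between uniform marginals over
    objectives (rows) and solutions (columns). A stand-in solver;
    the paper may compute the transport differently."""
    n, m = C.shape
    a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)
    K = np.exp(-C / reg)
    u = np.ones(n)
    for _ in range(n_iters):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]  # transport plan

def most_sketch(xs, lr=0.2, n_rounds=300):
    """Alternating scheme: recompute the objective-to-solution
    transport plan from current losses, then take a gradient step
    on each solution's transport-weighted objectives."""
    xs = xs.copy()
    for _ in range(n_rounds):
        gamma = sinkhorn(loss_matrix(xs))
        for j in range(xs.shape[0]):
            g = sum(gamma[i, j] * grad(xs[j], i)
                    for i in range(targets.shape[0]))
            xs[j] -= lr * g
    return xs

xs_final = most_sketch(np.array([-1.0, 0.5, 1.5]))
print(np.round(np.sort(xs_final), 2))  # each solution specializes near one target
```

In this sketch the marginal constraints of the transport plan are what force diversity: each objective must send its unit of mass somewhere, so no objective is ignored, and each solution must receive mass, so two solutions cannot collapse onto the same objective without starving another.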
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 8707