Maximizing Submodular Functions under Submodular Constraints

Published: 08 May 2023, Last Modified: 26 Jun 2023, UAI 2023
Keywords: submodular optimization, submodular constraint, data-dependent approximation, difference of submodular functions
Abstract: We consider the problem of maximizing submodular functions under submodular constraints by formulating the problem in two ways: SCSK-C and Diff-C. Given two submodular functions $f$ and $g$ where $f$ is monotone, the objective of the SCSK-C problem is to find a set $S$ of size at most $k$ that maximizes $f(S)$ subject to the constraint $g(S)\leq \theta$, for a given threshold $\theta$. The Diff-C problem asks for a set $S$ of size at most $k$ such that $h(S) = f(S)-g(S)$ is maximized. It is known that these problems are highly inapproximable and do not admit any constant-factor multiplicative approximation algorithms unless NP is easy. Known approximation algorithms involve data-dependent approximation factors that are not efficiently computable. We initiate a study of the design of approximation algorithms whose approximation factors are efficiently computable. For the SCSK-C problem, we prove that the greedy algorithm produces a solution whose value is at least $(1-1/e)f(OPT) - A$, where $A$ is a data-dependent additive error. For the Diff-C problem, we design an algorithm that uses the SCSK-C greedy algorithm as a subroutine. This algorithm produces a solution whose value is at least $(1-1/e)h(OPT) - B$, where $B$ is also a data-dependent additive error. A salient feature of our approach is that the additive error terms can be computed efficiently, thus enabling us to ascertain the quality of the solutions produced.
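
To make the SCSK-C setup concrete, the sketch below shows one standard greedy routine of the kind analyzed here: it repeatedly adds the element with the largest marginal gain in $f$ among those that keep $g$ within the budget $\theta$, stopping once $|S| = k$ or no feasible element improves $f$. This is a minimal illustration, assuming $f$ and $g$ are given as set-function callables; the name greedy_scsk_c and the toy objective and cost are illustrative assumptions, not taken from the paper.

    # A minimal sketch of a greedy of the kind described for SCSK-C.
    # f, g: callables on Python sets; f monotone submodular, g submodular.
    # Names and toy functions below are illustrative, not from the paper.

    def greedy_scsk_c(f, g, ground_set, k, theta):
        """Grow S greedily: at each step add the element with the largest
        marginal gain in f among elements that keep g(S) <= theta,
        stopping at |S| = k or when no feasible element improves f."""
        S = set()
        while len(S) < k:
            best_elem, best_gain = None, 0.0
            for e in ground_set - S:
                candidate = S | {e}
                if g(candidate) > theta:      # respect the submodular budget
                    continue
                gain = f(candidate) - f(S)    # marginal gain of e w.r.t. f
                if gain > best_gain:
                    best_elem, best_gain = e, gain
            if best_elem is None:             # no feasible improving element
                break
            S.add(best_elem)
        return S

    # Toy usage with simple submodular choices (modular f, concave-of-cardinality g):
    V = {1, 2, 3, 4}
    f = lambda S: sum(S)                 # monotone modular, hence submodular
    g = lambda S: 2 * len(S) ** 0.5      # submodular cost, purely illustrative
    print(greedy_scsk_c(f, g, V, k=2, theta=3.0))   # e.g. {3, 4}

The paper's guarantee of the form $(1-1/e)f(OPT) - A$ with an efficiently computable additive error $A$ refers to its analysis of such a greedy; the data-dependent quantity $A$ itself is defined in the paper and is not reproduced in this sketch.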