Extended Deep Submodular Functions

Published: 19 Dec 2024, Last Modified: 19 Dec 2024. Accepted by TMLR. License: CC BY 4.0
Abstract: We introduce a novel representation of monotone set functions called Extended Deep Submodular Functions (EDSFs), which are neural-network-representable. EDSFs extend Deep Submodular Functions (DSFs), inheriting crucial properties from DSFs while addressing their innate limitations. It is known that DSFs can represent only a strict subset of submodular functions. In contrast, we establish that EDSFs can represent all monotone submodular functions, a notable enhancement over DSFs. Furthermore, our findings demonstrate that EDSFs can represent any monotone set function, indicating that the family of EDSFs is equivalent to the family of all monotone set functions. Additionally, we prove that EDSFs maintain the concavity inherent in DSFs when the components of the input vector are non-negative real numbers—an essential feature in certain combinatorial optimization problems. Through extensive experiments, we demonstrate that EDSFs exhibit significantly lower empirical generalization error in representing and learning coverage and cut functions compared to existing baselines, such as DSFs, Deep Sets, and Set Transformers.
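To make the properties discussed in the abstract concrete, the following is an illustrative sketch (not code from the paper) of a coverage function, one of the benchmark set-function families mentioned above, together with a brute-force check of the two properties at issue: monotonicity and submodularity (diminishing returns). The ground-set elements and their covered subsets here are hypothetical examples.

```python
from itertools import chain, combinations

# Hypothetical example: each ground-set element i covers a subset cover[i]
# of a universe; f(S) is the size of the union of the covered subsets.
# Coverage functions of this form are monotone submodular.
cover = {
    0: {"a", "b"},
    1: {"b", "c"},
    2: {"c", "d", "e"},
}

def f(S):
    """Coverage value of a set S of ground-set elements."""
    covered = set()
    for i in S:
        covered |= cover[i]
    return len(covered)

def is_monotone_submodular(f, ground):
    """Brute-force check over all pairs S <= T of subsets of `ground`:
    monotonicity (f(S) <= f(T)) and diminishing returns
    (f(S + v) - f(S) >= f(T + v) - f(T) for v outside T)."""
    subsets = list(chain.from_iterable(
        combinations(ground, r) for r in range(len(ground) + 1)))
    for S in map(set, subsets):
        for T in map(set, subsets):
            if not S <= T:
                continue
            if f(S) > f(T):  # violates monotonicity
                return False
            for v in ground - T:
                if f(S | {v}) - f(S) < f(T | {v}) - f(T):
                    return False  # violates diminishing returns
    return True

print(is_monotone_submodular(f, {0, 1, 2}))  # True
```

Cut functions, by contrast, are submodular but not monotone in general, which is one reason learning them is a distinct benchmark from coverage functions.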
Submission Length: Long submission (more than 12 pages of main content)
Code: https://github.com/semohosseini/comb-auction
Assigned Action Editor: ~Jonathan_Scarlett1
Submission Number: 3374