Learning Set Functions with Implicit Differentiation

Published: 27 Jun 2024, Last Modified: 20 Aug 2024. Venue: Differentiable Almost Everything. License: CC BY 4.0
Keywords: learning set functions, implicit differentiation, neural set functions, deep learning, probabilistic methods
TL;DR: We use implicit differentiation for efficiently learning set functions under the optimal subset oracle.
Abstract: Ou et al. [1] introduce the problem of learning set functions from data generated by a so-called optimal subset oracle. Their approach approximates the underlying utility function with an energy-based model. This approximation yields iterations of fixed-point updates during mean-field variational inference. However, as the number of iterations increases, automatic differentiation becomes computationally prohibitive due to the size of the Jacobians that are stacked during back-propagation. We address this challenge with implicit differentiation and examine the convergence conditions for the fixed-point iterations. We empirically demonstrate the efficiency of our method on subset selection applications including product recommendation and anomaly detection tasks.
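The core idea in the abstract — replacing back-propagation through many fixed-point updates with a single implicit-differentiation step — can be illustrated on a toy problem. The sketch below is not the paper's method; it is a minimal scalar example (a hypothetical contraction `f(x, theta) = 0.5*x + theta`) showing how the implicit function theorem recovers the gradient of a fixed point without storing any intermediate Jacobians.

```python
import numpy as np

# Hypothetical toy update map (not from the paper): a contraction in x
# with fixed point x* = 2 * theta, since x* = 0.5 * x* + theta.
def f(x, theta):
    return 0.5 * x + theta

def solve_fixed_point(theta, x0=0.0, n_iter=50):
    # Forward pass: plain fixed-point iteration. No gradient tape is kept,
    # which is the memory saving implicit differentiation buys.
    x = x0
    for _ in range(n_iter):
        x = f(x, theta)
    return x

def implicit_grad(x_star, theta, eps=1e-6):
    # Implicit function theorem at the fixed point x* = f(x*, theta):
    #   dx*/dtheta = (1 - df/dx)^(-1) * df/dtheta,
    # valid when |df/dx| < 1 (the contraction condition the paper's
    # convergence analysis is concerned with). Partials are estimated
    # with central finite differences for simplicity.
    dfdx = (f(x_star + eps, theta) - f(x_star - eps, theta)) / (2 * eps)
    dfdtheta = (f(x_star, theta + eps) - f(x_star, theta - eps)) / (2 * eps)
    return dfdtheta / (1.0 - dfdx)

theta = 1.5
x_star = solve_fixed_point(theta)    # converges to x* = 2 * theta = 3.0
grad = implicit_grad(x_star, theta)  # dx*/dtheta = 1 / (1 - 0.5) = 2.0
print(x_star, grad)
```

Because the gradient depends only on the converged point, the cost of the backward pass is independent of the number of forward iterations, in contrast to unrolled automatic differentiation.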
Submission Number: 37