Keywords: loss functions, convex analysis, monotone operators
TL;DR: A new family of loss functions based on monotone operator theory that lower bound Fenchel-Young losses, such as the logistic loss.
Abstract: Fenchel-Young losses are a family of loss functions, encompassing the squared,
logistic and sparsemax losses, among others. They are convex w.r.t. the model
output and the target, separately. Each Fenchel-Young loss is implicitly associated
with a link function that maps model outputs to predictions. For instance, the
logistic loss is associated with the soft argmax link function. Can we build new
loss functions associated with the same link function as Fenchel-Young losses?
In this paper, we introduce Fitzpatrick losses, a new family of separately convex
loss functions based on the Fitzpatrick function. A well-known theoretical tool in
maximal monotone operator theory, the Fitzpatrick function naturally leads to a
refined Fenchel-Young inequality, making Fitzpatrick losses tighter than Fenchel-
Young losses, while maintaining the same link function for prediction. As an
example, we introduce the Fitzpatrick logistic loss and the Fitzpatrick sparsemax
loss, counterparts of the logistic and the sparsemax losses. This yields two new,
tighter losses associated with the soft argmax and the sparse argmax, two of the
most widely used output layers in machine learning. We study in detail the
properties of Fitzpatrick losses and, in particular, we show that they can be seen as
Fenchel-Young losses using a modified, target-dependent generating function. We
demonstrate the effectiveness of Fitzpatrick losses for label proportion estimation.
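For background, a sketch of the standard definitions behind the abstract's claim (standard monotone operator theory notation, not necessarily the paper's): the Fitzpatrick function of a maximally monotone operator $A$, together with the sandwich inequality for $A = \partial f$ that makes the Fitzpatrick gap a tighter nonnegative quantity than the Fenchel-Young gap.

```latex
% Fitzpatrick function of a maximally monotone operator A:
\[
F_A(x, x^*) \;=\; \sup_{(y, y^*) \in \operatorname{gra} A}
  \big( \langle x, y^* \rangle + \langle y, x^* \rangle - \langle y, y^* \rangle \big).
\]
% For A = \partial f with f convex, proper, and lower semicontinuous,
% F_{\partial f} sits between the coupling and the Fenchel-Young bound,
% with equality on the left iff (x, x^*) \in \operatorname{gra} \partial f:
\[
\langle x, x^* \rangle \;\le\; F_{\partial f}(x, x^*) \;\le\; f(x) + f^*(x^*).
\]
% Subtracting <x, x*> throughout gives the refined inequality: the
% Fitzpatrick gap lower bounds the Fenchel-Young gap, so a loss built
% from F_{\partial f} is tighter than the corresponding Fenchel-Young loss:
\[
0 \;\le\; F_{\partial f}(x, x^*) - \langle x, x^* \rangle
  \;\le\; f(x) + f^*(x^*) - \langle x, x^* \rangle.
\]
```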
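To make the link-function association concrete, here is a minimal Python sketch of the classical Fenchel-Young pair mentioned in the abstract: with f the negative Shannon entropy, the Fenchel-Young loss f*(θ) + f(y) − ⟨θ, y⟩ is the logistic loss and the link ∇f* is the soft argmax. This is standard background, not the paper's Fitzpatrick loss.

```python
import numpy as np
from scipy.special import logsumexp, softmax

def logistic_fy_loss(theta, y):
    """Fenchel-Young loss generated by f = negative Shannon entropy.

    L(theta; y) = f*(theta) + f(y) - <theta, y>, with f*(theta) = logsumexp(theta).
    Nonnegative, and zero iff y equals the soft argmax of theta.
    """
    f_star = logsumexp(theta)                        # f*(theta)
    f_y = np.sum(y * np.log(np.clip(y, 1e-12, 1.0)))  # f(y) = <y, log y>, 0 log 0 := 0
    return f_star + f_y - theta @ y

def link(theta):
    """Link function nabla f*: the soft argmax."""
    return softmax(theta)

theta = np.array([1.0, -0.5, 0.2])
y = np.array([0.7, 0.1, 0.2])  # e.g., a label proportion target
print(logistic_fy_loss(theta, y))  # >= 0
print(link(theta))                 # prediction on the simplex
```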
Primary Area: Optimization (convex and non-convex, discrete, stochastic, robust)
Submission Number: 7288