Leveraged Weighted Loss For Partial Label Learning

28 Sept 2020 (modified: 05 May 2023) · ICLR 2021 Conference Withdrawn Submission · Readers: Everyone
Keywords: weakly supervised learning, loss function, risk consistency
Abstract: As an important branch of weakly supervised learning, partial label learning deals with data where each instance is assigned a set of candidate labels, of which only one is true. In this paper, we propose a family of loss functions named the Leveraged Weighted (LW) loss, which for the first time introduces a leverage parameter $\beta$ into partial loss functions to trade off losses on partial labels against losses on residual labels (non-partial labels). Under mild assumptions, we establish the relationship between the partial loss function and its corresponding ordinary loss, which leads to risk consistency. Compared to the existing literature, our result applies to both deterministic and stochastic scenarios, covers loss functions of a more general form, and makes milder assumptions on the distribution of the partial label set. As special cases, with $\beta = 1$ and $\beta = 2$, the corresponding ordinary losses of our LW loss match the binary classification loss and the \textit{one-versus-all} (OVA) loss function, respectively. In this way, our theorems explain the experimental results of the parameter analysis, where $\beta = 1$ and especially $\beta = 2$ emerge as preferred choices for the leverage parameter $\beta$. Last but not least, comparisons on real data show that our LW loss outperforms other state-of-the-art partial label learning algorithms.
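The abstract describes the LW loss as a combination of losses on the candidate (partial) labels and $\beta$-leveraged losses on the residual labels. Below is a minimal PyTorch sketch of that idea, assuming uniform per-label weights and a logistic binary loss $\psi(z) = \log(1 + e^{-z})$; the function name `lw_loss` and these specific choices are illustrative assumptions, not the authors' exact formulation.

```python
import torch
import torch.nn.functional as F

def lw_loss(scores, candidate_mask, beta=2.0, psi=F.softplus):
    """Illustrative sketch of a Leveraged Weighted (LW) style loss.

    scores:         (batch, num_classes) real-valued class scores z.
    candidate_mask: (batch, num_classes) bool, True on candidate labels.
    beta:           leverage parameter trading off partial vs. residual labels.
    psi:            binary margin loss; softplus(-z) = log(1 + e^{-z}) is the
                    logistic loss, used here as a placeholder choice.
    """
    # Loss on candidate (partial) labels: penalize low scores inside the set.
    partial_term = (psi(-scores) * candidate_mask).sum(dim=1)
    # Loss on residual (non-candidate) labels, leveraged by beta:
    # penalize high scores outside the candidate set.
    residual_term = (psi(scores) * (~candidate_mask)).sum(dim=1)
    return (partial_term + beta * residual_term).mean()

# Hypothetical usage: 4 instances, 10 classes, first 3 classes as candidates.
scores = torch.randn(4, 10, requires_grad=True)
candidate_mask = torch.zeros(4, 10, dtype=torch.bool)
candidate_mask[:, :3] = True
loss = lw_loss(scores, candidate_mask, beta=2.0)  # beta = 2: the OVA-like case
loss.backward()
```

Per the abstract, setting `beta=1` corresponds to the binary classification loss and `beta=2` to the OVA loss as the matching ordinary losses, which is why these two values are highlighted as preferred choices.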
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
One-sentence Summary: This paper proposes a family of loss functions named the Leveraged Weighted (LW) loss, which introduces a leverage parameter, and examines its theoretical and experimental properties.
Supplementary Material: zip
Reviewed Version (pdf): https://openreview.net/references/pdf?id=u_WmFBOyax