Abstract: Label distribution learning (LDL) is a general learning framework, which assigns
to an instance a distribution over a set of labels rather than a single label or multiple
labels. Existing LDL methods either make restrictive assumptions about the form of the label distribution or are limited in representation learning, e.g., they cannot learn deep features in an end-to-end manner. This paper presents label distribution learning forests (LDLFs), a novel label distribution learning algorithm based on
differentiable decision trees, which have several advantages: 1) Decision trees
have the potential to model any general form of label distribution as a mixture
of leaf node predictions. 2) The learning of differentiable decision trees can be
combined with representation learning. We define a distribution-based loss function
for a forest, enabling all the trees to be learned jointly, and show that an update
function for leaf node predictions, which guarantees a strict decrease of the loss
function, can be derived by variational bounding. The effectiveness of the proposed
LDLFs is verified on several LDL tasks and a computer vision application, showing
significant improvements over state-of-the-art LDL methods.
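
To make the described construction concrete, below is a minimal sketch (not the authors' implementation) of a single differentiable decision tree of the kind the abstract refers to: sigmoid split nodes route each instance softly to the leaves, each leaf stores a label distribution, and the tree prediction is the mixture of leaf distributions weighted by the routing probabilities; the training loss is the KL divergence between the ground-truth and predicted label distributions. For simplicity, the leaf distributions here are parameterized by a softmax and trained by gradient descent, which stands in for the variational-bounding update derived in the paper; the class name, tree depth, and loss helper are illustrative assumptions.

```python
# Minimal sketch of a differentiable (soft) decision tree for label
# distribution learning. Not the authors' code; names and defaults are
# assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SoftDecisionTree(nn.Module):
    def __init__(self, in_dim, n_labels, depth=4):
        super().__init__()
        self.n_leaves = 2 ** depth
        self.n_splits = self.n_leaves - 1
        # Each split node applies a sigmoid to a linear function of the input
        # features, giving the probability of routing an instance to the left.
        self.split = nn.Linear(in_dim, self.n_splits)
        # Leaf label distributions, stored as logits and normalized by softmax.
        self.leaf_logits = nn.Parameter(torch.zeros(self.n_leaves, n_labels))

    def routing(self, x):
        d = torch.sigmoid(self.split(x))   # P(route left) at every split node
        mu = x.new_ones(x.size(0), 1)      # routing probability, start at root
        begin, nodes = 0, 1
        # Walk the tree level by level, splitting the routing mass at each node.
        while begin + nodes <= self.n_splits:
            dl = d[:, begin:begin + nodes]
            mu = torch.stack([mu * dl, mu * (1 - dl)], dim=2).flatten(1)
            begin += nodes
            nodes *= 2
        return mu                          # (batch, n_leaves)

    def forward(self, x):
        mu = self.routing(x)
        leaf_dist = F.softmax(self.leaf_logits, dim=1)
        # Tree prediction: mixture of leaf distributions weighted by routing.
        return mu @ leaf_dist              # (batch, n_labels)


def ldl_loss(pred, target, eps=1e-8):
    # KL(target || pred): distribution-based loss between the ground-truth
    # label distribution and the tree's predicted label distribution.
    return (target * (torch.log(target + eps) - torch.log(pred + eps))).sum(1).mean()
```

A forest would combine several such trees (e.g., by averaging their predicted distributions) and minimize the same loss jointly over all trees, with the split functions taking deep features as input so that representation learning and tree learning are trained end-to-end.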