Stacked co-training for semi-supervised multi-label learning

Published: 01 Jan 2024, Last Modified: 02 May 2025. Inf. Sci. 2024. License: CC BY-SA 4.0
Abstract: Because annotation is difficult, multi-label learning often has access to only a small amount of labeled data supplemented by a large amount of unlabeled data. To address this issue, many algorithms extend existing single-label semi-supervised strategies to multi-label applications, but fail to effectively account for the characteristics of semi-supervised multi-label learning. In this paper, a novel method named SCTML (Stacked Co-Training for Multi-Label learning) is proposed for semi-supervised multi-label learning. Through a two-layer stacking framework, SCTML learns label correlations in both the base learners and the meta learner, and effectively incorporates the semi-supervised assumptions of co-training, clustering, and the manifold assumption. Extensive experiments demonstrate that this combination of multiple semi-supervised learning strategies effectively addresses the semi-supervised multi-label learning problem.
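The abstract does not give the paper's exact formulation, but the general idea it describes can be illustrated with a toy sketch: two base learners on separate feature views co-train by exchanging confidently pseudo-labeled examples, and a second-layer meta learner is then trained on the stacked base-learner scores. All function names, the k-NN base learner, and the confidence threshold below are illustrative assumptions, not the authors' method.

```python
import numpy as np

def knn_predict(X_train, Y_train, X, k=3):
    """Toy multi-label k-NN scorer: average the label vectors of the
    k nearest training points, giving a per-label score in [0, 1]."""
    d = ((X[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    idx = np.argsort(d, axis=1)[:, :k]
    return Y_train[idx].mean(axis=1)

def stacked_cotrain(X_l, Y_l, X_u, views, rounds=2, thresh=0.8):
    """Sketch of stacked co-training with exactly two feature views.

    Layer 1: each view's learner pseudo-labels unlabeled examples it is
    confident about and hands them to the *other* view's learner.
    Layer 2: a meta learner is fit on the concatenated base-learner
    scores over the labeled data (the stacking step).
    """
    pools = [(X_l[:, v], Y_l.copy()) for v in views]
    for _ in range(rounds):
        for i in range(2):
            if X_u.shape[0] == 0:
                break
            Xi, Yi = pools[i]
            scores = knn_predict(Xi, Yi, X_u[:, views[i]])
            # Confident only if every label's score is far from 0.5.
            conf = np.abs(scores - 0.5).min(axis=1)
            pick = conf >= thresh - 0.5
            if not pick.any():
                continue
            j = 1 - i  # teach the other view's learner
            Xj, Yj = pools[j]
            pools[j] = (np.vstack([Xj, X_u[pick][:, views[j]]]),
                        np.vstack([Yj, (scores[pick] >= 0.5).astype(float)]))
            X_u = X_u[~pick]  # remove pseudo-labeled examples from the pool
    # Meta layer: stack both views' scores on the labeled data.
    meta_X = np.hstack([knn_predict(Xi, Yi, X_l[:, views[i]])
                        for i, (Xi, Yi) in enumerate(pools)])
    def predict(X_new):
        base = np.hstack([knn_predict(Xi, Yi, X_new[:, views[i]])
                          for i, (Xi, Yi) in enumerate(pools)])
        return knn_predict(meta_X, Y_l, base)
    return predict

# Usage on synthetic data with two labels and two 2-feature views.
rng = np.random.default_rng(0)
X_l = rng.normal(size=(20, 4))
Y_l = (X_l[:, :2] > 0).astype(float)
X_u = rng.normal(size=(30, 4))
views = (np.array([0, 1]), np.array([2, 3]))
predict = stacked_cotrain(X_l, Y_l, X_u, views)
P = predict(rng.normal(size=(5, 4)))  # per-label scores, shape (5, 2)
```

The sketch omits the paper's clustering and manifold components; it only shows how the co-training exchange and the stacking meta layer fit together.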