Exploring Contrastive Learning for Long-Tailed Multi-Label Text Classification

Anonymous

16 Dec 2023 · ACL ARR 2023 December Blind Submission · Readers: Everyone
Abstract: Learning an effective representation for multi-label text classification (MLTC) is a significant challenge in NLP. The difficulty stems from two key factors: the intricate interconnections among labels and the pervasive long-tailed distribution of the data. One promising approach to this problem is to combine supervised contrastive learning with classical supervised loss functions. Although contrastive learning has shown remarkable performance in multi-class classification, its impact in the multi-label setting has not been thoroughly examined. In this paper, we conduct an in-depth study of supervised contrastive learning and its influence on representation in the MLTC context. We emphasize the importance of accounting for long-tailed data distributions when building a robust representation space, addressing two critical challenges of contrastive learning: the "lack of positives" and the "attraction-repulsion imbalance". Building on this insight, we introduce a novel contrastive loss function for MLTC. It attains Micro-F1 scores that match or surpass those of other commonly used loss functions, and yields a significant improvement in Macro-F1 scores across three multi-label datasets.
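The abstract stops at the high-level idea, so the following is a minimal, hypothetical sketch of what a supervised contrastive (SupCon-style) loss adapted to multi-label targets can look like: positives for each anchor are batch items sharing at least one label, weighted by Jaccard overlap, which is one common way to soften the "lack of positives" and the attraction-repulsion imbalance the abstract mentions. The function name, the Jaccard weighting, and the temperature value are illustrative assumptions; this is not the paper's proposed loss.

```python
import torch
import torch.nn.functional as F

def multilabel_supcon_loss(embeddings: torch.Tensor,
                           labels: torch.Tensor,
                           temperature: float = 0.07) -> torch.Tensor:
    """embeddings: (B, D) encoder outputs; labels: (B, C) multi-hot targets."""
    z = F.normalize(embeddings, dim=1)
    sim = (z @ z.T) / temperature                      # (B, B) scaled cosine sims
    eye = torch.eye(len(z), dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(eye, float("-inf"))          # exclude self from denominator

    lbl = labels.float()
    inter = lbl @ lbl.T                                # shared-label counts
    union = lbl.sum(1, keepdim=True) + lbl.sum(1) - inter
    jaccard = (inter / union.clamp(min=1)).masked_fill(eye, 0.0)

    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    log_prob = log_prob.masked_fill(eye, 0.0)          # avoid 0 * -inf = nan on diagonal

    # Normalize positive weights per anchor so head labels (with many
    # positives in the batch) do not dominate the gradient -- one way to
    # mitigate attraction-repulsion imbalance under long-tailed data.
    pos_weight = jaccard / jaccard.sum(1, keepdim=True).clamp(min=1e-8)
    loss = -(pos_weight * log_prob).sum(1)

    has_pos = jaccard.sum(1) > 0                       # skip anchors with no positives
    return loss[has_pos].mean() if has_pos.any() else sim.new_zeros(())
```

In practice, and consistent with the abstract's framing, such a contrastive term would be added to a classical supervised objective (e.g., binary cross-entropy over the label vector) rather than used alone.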
Paper Type: long
Research Area: Machine Learning for NLP
Contribution Types: Theory
Languages Studied: English