COPHTC: Contrastive Learning with Prompt Tuning for Hierarchical Text Classification

Published: 01 Jan 2024, Last Modified: 13 Nov 2024. ICASSP 2024. License: CC BY-SA 4.0.
Abstract: Hierarchical Text Classification (HTC) is an essential yet challenging task in natural language processing (NLP) due to its complex label structure. Recently, a number of approaches have employed prompt learning in HTC, achieving noteworthy results. However, prompt-based HTC neither further optimizes the representation of samples based on label relationships, nor dynamically adjusts the relative positions of sample pairs in the embedding space. In this work, we propose a COntrastive-enhanced Prompt-based model for HTC tasks (COPHTC). More specifically, we integrate contrastive learning into prompt tuning while employing momentum updates and a dynamic queue to offer a greater variety of positive and negative samples for the input texts. Extensive experiments on three public datasets verify the effectiveness of COPHTC.
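The abstract's "momentum updates and a dynamic queue" follow the general MoCo-style recipe: a slowly updated key encoder supplies stable positive keys, while a fixed-size queue of past keys supplies many negatives. The sketch below is a minimal NumPy illustration of that generic mechanism under assumed simplifications (linear encoders, random data); all names here are hypothetical and it is not the authors' COPHTC implementation.

```python
import numpy as np

class MomentumQueueContrast:
    """Hypothetical sketch of MoCo-style contrastive learning:
    a momentum-updated key encoder plus a dynamic negative queue."""

    def __init__(self, dim=8, queue_size=64, momentum=0.999, seed=0):
        rng = np.random.default_rng(seed)
        self.W_q = rng.normal(size=(dim, dim))   # query encoder (trainable)
        self.W_k = self.W_q.copy()               # key encoder starts as a copy
        self.m = momentum
        # queue of L2-normalized negative keys
        self.queue = rng.normal(size=(queue_size, dim))
        self.queue /= np.linalg.norm(self.queue, axis=1, keepdims=True)
        self.ptr = 0

    def _encode(self, x, W):
        z = x @ W
        return z / np.linalg.norm(z, axis=1, keepdims=True)

    def momentum_update(self):
        # key encoder tracks the query encoder via an exponential moving average
        self.W_k = self.m * self.W_k + (1.0 - self.m) * self.W_q

    def enqueue(self, keys):
        # overwrite the oldest entries in the circular queue
        n = keys.shape[0]
        idx = (self.ptr + np.arange(n)) % self.queue.shape[0]
        self.queue[idx] = keys
        self.ptr = (self.ptr + n) % self.queue.shape[0]

    def info_nce_loss(self, x, temperature=0.07):
        q = self._encode(x, self.W_q)            # queries
        self.momentum_update()
        k = self._encode(x, self.W_k)            # positive keys (no gradient in practice)
        l_pos = np.sum(q * k, axis=1, keepdims=True)   # one positive per query
        l_neg = q @ self.queue.T                       # many negatives from the queue
        logits = np.concatenate([l_pos, l_neg], axis=1) / temperature
        # cross-entropy with the positive at index 0
        logits -= logits.max(axis=1, keepdims=True)
        log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
        loss = -log_probs[:, 0].mean()
        self.enqueue(k)                          # refresh the negative pool
        return loss

rng = np.random.default_rng(1)
model = MomentumQueueContrast()
loss = model.info_nce_loss(rng.normal(size=(4, 8)))
print(f"InfoNCE loss: {loss:.4f}")
```

The queue decouples the number of negatives from the batch size, and the momentum coefficient (here 0.999) keeps the key representations consistent across updates; both choices mirror the standard MoCo setup rather than anything specific to HTC.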