Joint contrastive learning for prompt-based few-shot language learners

Published: 01 Jan 2024 · Last Modified: 16 May 2025 · Neural Comput. Appl. 2024 · CC BY-SA 4.0
Abstract: The combination of prompt learning and contrastive learning has recently emerged as a promising approach to few-shot learning in NLP. However, most of these studies focus only on semantic-level relevance and intra-class information at the class level, ignoring the importance of fine-grained instance-level feature representations. This paper proposes a joint contrastive learning (JCL) framework that combines instance-level contrastive learning, which learns fine-grained differences between feature representations, with class-level contrastive learning, which captures richer intra-class information. Experimental results demonstrate that the proposed JCL method is effective and generalizes well. Our code is available at https://github.com/2251821381/JCL.
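To make the two levels of the abstract concrete, here is a minimal NumPy sketch of what a joint objective of this shape could look like. It is not the authors' implementation: it assumes a SimCLR-style NT-Xent loss for the instance level, a SupCon-style loss for the class level, and a hypothetical weighting hyperparameter `lam`; see the linked repository for the actual JCL method.

```python
import numpy as np

def _normalize(x):
    # L2-normalize embeddings row-wise so dot products are cosine similarities.
    return x / np.linalg.norm(x, axis=1, keepdims=True)

def instance_contrastive_loss(z1, z2, tau=0.5):
    """Instance-level (NT-Xent-style) loss: for two augmented views z1, z2
    of the same batch, each embedding's positive is the other view of the
    same instance; all remaining embeddings act as negatives."""
    n = z1.shape[0]
    z = _normalize(np.concatenate([z1, z2], axis=0))   # (2n, d)
    sim = z @ z.T / tau
    np.fill_diagonal(sim, -np.inf)                     # exclude self-similarity
    # Index of each embedding's positive: row i pairs with row i+n and vice versa.
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(log_prob[np.arange(2 * n), pos])

def class_contrastive_loss(z, labels, tau=0.5):
    """Class-level (SupCon-style) loss: all other samples sharing a label
    are positives, pulling intra-class representations together."""
    n = z.shape[0]
    z = _normalize(z)
    sim = z @ z.T / tau
    np.fill_diagonal(sim, -np.inf)
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    total = 0.0
    for i in range(n):
        pos = (labels == labels[i]) & (np.arange(n) != i)
        if pos.any():
            total += -log_prob[i, pos].mean()
    return total / n

def joint_contrastive_loss(z1, z2, labels, lam=0.5, tau=0.5):
    """Hypothetical joint objective: weighted sum of the two levels
    (lam is an assumed hyperparameter, not taken from the paper)."""
    z_all = np.concatenate([z1, z2], axis=0)
    labels_all = np.concatenate([labels, labels])
    return (lam * instance_contrastive_loss(z1, z2, tau)
            + (1 - lam) * class_contrastive_loss(z_all, labels_all, tau))
```

In practice the embeddings would come from a prompt-based encoder (e.g. [MASK]-token representations), and the joint loss would be added to the usual prompt-classification objective during few-shot fine-tuning.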