TPTCD: A Prompt Tuning based Two-Step Framework for Cross-Domain Text Classification

ACL ARR 2025 May Submission1930 Authors

18 May 2025 (modified: 03 Jul 2025) · ACL ARR 2025 May Submission · License: CC BY 4.0
Abstract: In recent years, with the rapid development of large models, prompt tuning has shown strong performance in cross-domain text classification. Nevertheless, it still faces two issues: (1) prompt tuning-based methods usually do not align samples of the same class across different domains; and (2) they tend to focus on simple samples in the source domain, which may hinder improvements in the model's adaptability. To alleviate these issues, we propose a new method called Two-step Prompt Tuning with supervised Contrastive learning and feature Disentanglement (TPTCD). Specifically, to enable the model to align the same labels across domains, we combine soft prompt tuning with supervised contrastive learning, leveraging the merits of both. Furthermore, to improve domain adaptation, we propose a novel adversarially enhanced Variational Autoencoder (VAE) for feature disentanglement, which enables the model to learn more effective features in both the source and target domains. Extensive experiments on benchmark datasets demonstrate the competitiveness of our method compared with state-of-the-art cross-domain text classification models.
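As a rough illustration of the label-alignment idea mentioned in the abstract (this is a minimal sketch, not the authors' implementation), the snippet below applies a standard supervised contrastive loss to encoder representations from a mixed source/target batch, so that examples sharing a label are pulled together regardless of domain. The function name, `temperature` value, and feature dimension are illustrative assumptions.

```python
# Hypothetical sketch: supervised contrastive loss over L2-normalized
# sentence representations (e.g., [CLS] states from a prompt-tuned PLM).
import torch
import torch.nn.functional as F

def supervised_contrastive_loss(features: torch.Tensor,
                                labels: torch.Tensor,
                                temperature: float = 0.1) -> torch.Tensor:
    """features: (batch, dim) encoder outputs; labels: (batch,) class ids."""
    feats = F.normalize(features, dim=-1)          # work in cosine-similarity space
    sim = feats @ feats.t() / temperature          # pairwise similarity logits
    # exclude self-similarity on the diagonal (large negative, not -inf, to avoid NaNs)
    self_mask = torch.eye(len(labels), dtype=torch.bool, device=features.device)
    sim = sim.masked_fill(self_mask, -1e9)
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    # positives: other samples with the same label, possibly from another domain
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    pos_count = pos_mask.sum(dim=1).clamp(min=1)
    loss = -(log_prob * pos_mask.float()).sum(dim=1) / pos_count
    return loss.mean()

# Toy usage: a batch mixing source and target examples with a shared label space.
feats = torch.randn(8, 768)
labels = torch.randint(0, 2, (8,))
print(supervised_contrastive_loss(feats, labels))
```

In the cross-domain setting described by the abstract, such a term would be added to the classification objective so that same-class representations from the two domains cluster together.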
Paper Type: Long
Research Area: Special Theme (conference specific)
Research Area Keywords: Efficient/Low-Resource Methods for NLP, Machine Learning for NLP, Sentiment Analysis, Stylistic Analysis, and Argument Mining
Contribution Types: NLP engineering experiment, Approaches to low-resource settings
Languages Studied: English
Submission Number: 1930