Using Ordinal Labels for Text Augmentation and Simultaneous Contrastive Learning for Downstream Tasks

ACL ARR 2024 June Submission5107 Authors

16 Jun 2024 (modified: 02 Jul 2024) · ACL ARR 2024 June Submission · CC BY 4.0
Abstract: Leveraging large language models, advances in text augmentation and embedding models for downstream tasks have shown promise; however, challenges remain in distinguishing texts with similar meanings. The proposed scheme incorporates ordered labels to enrich sequence information and employs an integrated technique combining contrastive and downstream learning. It outperforms full fine-tuning methods that rely on classification learning alone in text classification, because it effectively uses the ordered labels to train the model to distinguish similar texts with greater accuracy. Our method boosts data diversity and model accuracy by refining the model's sensitivity to nuance, exploiting strong hard-negative samples among the generated texts to further improve contrastive learning outcomes.
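The abstract describes jointly optimizing a classification (downstream) objective and a contrastive objective in which samples sharing an ordinal label act as positives and the rest as hard negatives. The paper does not give its exact formulation, so the sketch below is only an illustrative NumPy rendering of such a joint loss; the function names, the equal weighting parameter `alpha`, and the temperature value are assumptions, not the authors' implementation.

```python
import numpy as np

def cross_entropy(logits, labels):
    # standard softmax cross-entropy for the downstream classification task
    z = logits - logits.max(axis=1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

def ordinal_contrastive(embeddings, labels, temperature=0.1):
    # supervised-contrastive-style term: samples with the same ordinal
    # label are pulled together; the remaining in-batch samples serve as
    # (hard) negatives in the log-sum-exp denominator
    e = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = (e @ e.T) / temperature  # scaled cosine similarities
    n = len(labels)
    total = 0.0
    for i in range(n):
        others = np.arange(n) != i
        positives = others & (labels == labels[i])
        if not positives.any():
            continue  # no positive pair for this anchor in the batch
        log_denom = np.log(np.exp(sim[i][others]).sum())
        total += (log_denom - sim[i][positives]).mean()
    return total / n

def joint_loss(embeddings, logits, labels, alpha=0.5):
    # alpha balances the two objectives (hypothetical weighting)
    return cross_entropy(logits, labels) + alpha * ordinal_contrastive(embeddings, labels)
```

In practice the same idea is usually implemented in an autodiff framework so both terms backpropagate into a shared encoder; ordinal distance between labels could additionally be used to weight how hard each negative is, which is one way to read the paper's emphasis on ordered labels.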
Paper Type: Short
Research Area: NLP Applications
Research Area Keywords: NLP Applications, Machine Learning for NLP, Information Extraction
Contribution Types: Model analysis & interpretability, Data analysis, Position papers
Languages Studied: Korean
Submission Number: 5107