AcTune: Uncertainty-Based Active Self-Training for Active Fine-Tuning of Pretrained Language Models

NAACL-HLT 2022 (modified: 14 Nov 2022)
Yue Yu, Lingkai Kong, Jieyu Zhang, Rongzhi Zhang, Chao Zhang. Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. 2022.