AlphaTuning: Quantization-Aware Parameter-Efficient Adaptation of Large-Scale Pre-Trained Language Models

Published: 2022 (modified: 18 Apr 2023), Findings of EMNLP 2022
Authors: Se Jung Kwon, Jeonghoon Kim, Jeongin Bae, Kang Min Yoo, Jin-Hwa Kim, Baeseong Park, Byeongwook Kim, Jung-Woo Ha, Nako Sung, Dongsoo Lee. Findings of the Association for Computational Linguistics: EMNLP 2022. 2022.