ReAugKD: Retrieval-Augmented Knowledge Distillation For Pre-trained Language Models

Published: 01 Jan 2023, Last Modified: 25 Jul 2023. ACL (2) 2023.
Authors: Jianyi Zhang, Aashiq Muhamed, Aditya Anantharaman, Guoyin Wang, Changyou Chen, Kai Zhong, Qingjun Cui, Yi Xu, Belinda Zeng, Trishul Chilimbi, Yiran Chen. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), 2023.