bert2BERT: Towards Reusable Pretrained Language Models

ACL 2022 (modified: 25 Apr 2023)
Authors: Cheng Chen, Yichun Yin, Lifeng Shang, Xin Jiang, Yujia Qin, Fengyu Wang, Zhi Wang, Xiao Chen, Zhiyuan Liu, Qun Liu.
Published in: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2022.