A Study on Knowledge Distillation from Weak Teacher for Scaling Up Pre-trained Language Models
Hayeon Lee, Rui Hou, Jongpil Kim, Davis Liang, Sung Ju Hwang, Alexander Min
Published: 01 Jan 2023, Last Modified: 03 Oct 2023
ACL (Findings) 2023