Learning Dynamic BERT via Trainable Gate Variables and a Bi-modal Regularizer

CoRR 2021 (modified: 29 Jan 2023)
Abstract: The BERT model has shown significant success on various natural language processing tasks. However, due to its large size and high computational cost, the model suffers from high latency, which is fatal to its deployment on resource-limited devices. To tackle this problem, we propose a dynamic inference method for BERT based on trainable gate variables applied to input tokens and a regularizer with a bi-modal property. Our method reduces computational cost on the GLUE benchmark with a minimal performance drop. Moreover, the model can adjust the trade-off between performance and computational cost via a user-specified hyperparameter.
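As a rough illustration of the mechanism the abstract describes, the sketch below gates each input token with a trainable sigmoid value and penalizes gates with the bi-modal term g(1 - g), which is zero only at g ∈ {0, 1} and therefore pushes each gate toward a hard keep/drop decision. The gate parameterization, the specific penalty, and names such as `TokenGate` and `lambda_cost` are assumptions for illustration, not the paper's actual formulation.

```python
import torch
import torch.nn as nn

class TokenGate(nn.Module):
    """Hypothetical per-token gate: a scalar in (0, 1) predicted from each
    token's hidden state, used to softly drop tokens at inference time."""
    def __init__(self, hidden_size: int):
        super().__init__()
        self.proj = nn.Linear(hidden_size, 1)

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # hidden_states: (batch, seq_len, hidden) -> gates: (batch, seq_len, 1)
        return torch.sigmoid(self.proj(hidden_states))

def bimodal_regularizer(gates: torch.Tensor) -> torch.Tensor:
    # g * (1 - g) peaks at g = 0.5 and vanishes at g in {0, 1}, so
    # minimizing it drives every gate toward a binary keep/drop value.
    return (gates * (1.0 - gates)).mean()

def training_loss(task_loss: torch.Tensor, gates: torch.Tensor,
                  lambda_cost: float = 0.1, lambda_bimodal: float = 1.0) -> torch.Tensor:
    # lambda_cost is the assumed user-specified hyperparameter controlling the
    # performance/computation trade-off: gates.mean() approximates the fraction
    # of tokens kept, so a larger lambda_cost rewards closing more gates.
    cost = gates.mean()
    return task_loss + lambda_cost * cost + lambda_bimodal * bimodal_regularizer(gates)
```

Under this reading, tokens whose gates converge to 0 can be skipped by later layers at inference time, which is where the reported computational savings would come from.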