PreCog: Exploring the Relation between Memorization and Performance in Pre-trained Language Models

Published: 01 Jan 2023, Last Modified: 03 Jun 2023. CoRR 2023.
Abstract: Pre-trained Language Models such as BERT are impressive machines with the ability to memorize, and possibly generalize, learning examples. We present a small, focused contribution to the analysis of the interplay between memorization and the performance of BERT in downstream tasks. We propose PreCog, a measure for evaluating memorization from pre-training, and we analyze its correlation with BERT's performance. Our experiments show that highly memorized examples are classified more accurately, suggesting that memorization is an essential key to BERT's success.
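The abstract does not spell out how PreCog is computed, so the sketch below is an illustration only, not the paper's definition: it scores an example by the fraction of its tokens that BERT's masked-language-model head can reconstruct, one plausible proxy for how well pre-training "covered" the example. The function name memorization_score, the choice of bert-base-uncased, and the scoring rule are all assumptions.

# Hypothetical PreCog-style memorization proxy (an assumption, not the
# paper's actual measure): mask each token in turn and count how often
# BERT's MLM head predicts it back.
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

def memorization_score(sentence: str) -> float:
    """Fraction of tokens recovered when masked one at a time."""
    ids = tokenizer(sentence, return_tensors="pt")["input_ids"][0]
    hits, total = 0, 0
    # Skip the special [CLS] (first) and [SEP] (last) tokens.
    for i in range(1, len(ids) - 1):
        masked = ids.clone()
        masked[i] = tokenizer.mask_token_id
        with torch.no_grad():
            logits = model(masked.unsqueeze(0)).logits
        pred = logits[0, i].argmax().item()
        hits += int(pred == ids[i].item())
        total += 1
    return hits / max(total, 1)

print(memorization_score("The capital of France is Paris."))

Per-example scores of this kind could then be correlated with downstream classification accuracy (for instance via scipy.stats.spearmanr) to reproduce the style of analysis the abstract describes.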