Keywords: in-context learning, transformers, linear regression
TL;DR: Within a statistical framework, we examine how transformer model capacity shapes the estimator the model learns in context.
Abstract: This paper investigates the relationship between model capacity and the emergence of in-context learning in transformers under a simplified statistical framework. When model capacity is sufficiently restricted, transformers shift from learning the Bayes-optimal estimator for the training task distribution to an estimator that generalizes to out-of-distribution tasks. We attribute this shift to the restricted model's inability to fully memorize the training task distribution. Further experiments examine how the transformer's hyperparameters affect its capacity for memorization.
Submission Number: 80