The Effect of Model Capacity on the Emergence of In-Context Learning

ICLR 2024 Workshop ME-FoMo Submission 80 Authors

Published: 04 Mar 2024, Last Modified: 05 May 2024
Venue: ME-FoMo 2024 Poster
License: CC BY 4.0
Keywords: in-context learning, transformers, linear regression
TL;DR: Within a statistical framework, we examine how transformer model capacity affects the estimator the model learns in context.
Abstract: This paper investigates the relationship between model capacity and the emergence of in-context learning in transformers under a simplified statistical framework. When model capacity is sufficiently restricted, transformers shift from learning the Bayes optimal estimator for the training task distribution to an estimator that generalizes to out-of-distribution tasks. We attribute this shift to the restricted model's inability to fully memorize the training task distribution. Further experiments examine how the transformer's hyperparameters affect its capacity for memorization.
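The abstract does not spell out the framework, but given the linear-regression keyword, a minimal sketch of the standard in-context linear regression setup can make the two estimators concrete: the Bayes optimal estimator for a finite training task distribution (a posterior-weighted average over the memorized task set, which a high-capacity model can approach) versus ridge regression (the Bayes optimal estimator under a Gaussian task prior, which remains suitable out of distribution). Everything below — the task set, dimensions, noise model, and function names — is an illustrative assumption, not a detail taken from the paper.

```python
import numpy as np

# Sketch of the standard in-context linear regression setup (assumed):
# a task is a weight vector w; a context is (X, y) with y = X @ w + noise.
rng = np.random.default_rng(0)
d, n, noise_var = 8, 16, 0.25

def ridge_estimator(X, y, noise_var, prior_var=1.0):
    """Bayes optimal posterior mean under a Gaussian task prior
    w ~ N(0, prior_var * I); generalizes to out-of-distribution tasks."""
    lam = noise_var / prior_var
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

def discrete_bayes_estimator(X, y, tasks, noise_var):
    """Bayes optimal estimator when tasks come from a finite set
    (rows of `tasks`): a likelihood-weighted average over the task set,
    i.e. what a model that memorizes the training tasks can achieve."""
    resid = y[None, :] - tasks @ X.T                 # (num_tasks, n)
    log_w = -0.5 * np.sum(resid**2, axis=1) / noise_var
    w = np.exp(log_w - log_w.max())                  # stable softmax weights
    return (w / w.sum()) @ tasks

# One context drawn from a finite training task distribution.
tasks = rng.normal(size=(64, d))
w_true = tasks[rng.integers(64)]
X = rng.normal(size=(n, d))
y = X @ w_true + rng.normal(scale=np.sqrt(noise_var), size=n)

print("ridge error:   ", np.linalg.norm(ridge_estimator(X, y, noise_var) - w_true))
print("discrete error:", np.linalg.norm(discrete_bayes_estimator(X, y, tasks, noise_var) - w_true))
```

On in-distribution contexts like the one above, the discrete estimator typically wins because it exploits the memorized task set; on tasks outside that set, ridge does not degrade, which is the contrast the abstract's capacity-driven shift refers to.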
Submission Number: 80