Language and Experience: A Computational Model of Social Learning in Complex Tasks

ICLR 2026 Conference Submission 17114 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: cognitive science; social learning; cultural learning; causal learning; bayesian models of cognition
TL;DR: Modeling human social and cultural learning as the joint inference of world models from language and experience, enabling cross-embodiment knowledge transfer
Abstract: The ability to combine linguistic guidance from others with direct experience is central to human development, enabling safe and rapid learning in new environments. How do people integrate these two sources of knowledge, and how might AI systems do the same? We present a computational framework that models human social learning as joint probabilistic inference over structured, executable world models given sensorimotor and linguistic data. We make this possible by turning a pretrained language model into a probabilistic model of how humans share advice conditioned on their beliefs, allowing our agents both to generate advice for others and to interpret linguistic input as evidence during Bayesian inference. Using behavioral experiments and simulations across 10 video games, we show how linguistic guidance can shape exploration and accelerate learning by reducing risky interactions and speeding up key discoveries in both humans and models. We further explore how knowledge can accumulate across generations through iterated learning experiments and demonstrate successful knowledge transfer between humans and models, revealing how structured, language-compatible representations might enable human-machine collaborative learning.
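To make the abstract's core mechanism concrete, here is a minimal sketch (not the authors' implementation) of joint Bayesian inference over candidate world models, where the likelihood of direct experience is combined with an LM-based likelihood of received advice. The function names (`lm_log_prob_of_advice`, `experience_log_likelihood`) and the toy scoring rules are hypothetical placeholders for the paper's actual components.

```python
# Sketch: posterior over candidate world models from experience + linguistic advice.
import math

def lm_log_prob_of_advice(advice: str, world_model: dict) -> float:
    """Hypothetical log P_LM(advice | world_model), standing in for scoring advice
    text under a pretrained language model conditioned on a candidate world model."""
    # Toy surrogate: advice mentioning a hazard is likelier under models containing it.
    return 0.0 if world_model.get("lava_is_deadly") == ("lava" in advice) else -2.0

def experience_log_likelihood(observations: list, world_model: dict) -> float:
    """Toy log P(observations | world_model) from direct sensorimotor experience."""
    return sum(0.0 if world_model.get(o) else -1.0 for o in observations)

def posterior_over_world_models(models, prior, observations, advice):
    """Joint inference: prior x experience likelihood x LM advice likelihood."""
    log_scores = [
        math.log(prior[i])
        + experience_log_likelihood(observations, m)
        + lm_log_prob_of_advice(advice, m)
        for i, m in enumerate(models)
    ]
    z = max(log_scores)                     # normalize in log space for stability
    weights = [math.exp(s - z) for s in log_scores]
    total = sum(weights)
    return [w / total for w in weights]

models = [{"lava_is_deadly": True}, {"lava_is_deadly": False}]
print(posterior_over_world_models(models, [0.5, 0.5], [], "avoid the lava"))
# Advice alone shifts belief toward the hazardous-lava model before any direct experience.
```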
Primary Area: applications to neuroscience & cognitive science
Submission Number: 17114