Anything Goes? A Crosslinguistic Study of (Im)possible Language Learning in LMs

ACL ARR 2025 February Submission 2725 Authors

15 Feb 2025 (modified: 09 May 2025) · ACL ARR 2025 February Submission · CC BY 4.0
Abstract: Do LLMs offer insights into human language learning? A common argument against this idea is that, because their architecture and training paradigm differ so vastly from those of humans, LLMs can learn arbitrary input as easily as natural languages. In this paper, we test this claim by training LMs to model impossible or typologically unattested languages. Unlike previous work, which has focused exclusively on English, we conduct experiments on 12 natural languages from 4 language families. Our results show that while GPT-2 small largely distinguishes attested languages from their impossible counterparts, it does not achieve perfect separation between all possible languages and all impossible ones. We further test whether GPT-2 small distinguishes typologically attested from unattested languages with different noun-phrase (NP) orders by manipulating word order according to Greenberg's Universal 20, and find that the model's perplexity scores do not separate attested from unattested word orders as long as the unattested variants preserve constituency structure. These findings suggest that language models exhibit some human-like inductive biases, though these biases are weaker than those found in human learners.
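To make the evaluation setup in the abstract concrete, the sketch below scores a sentence and a word-order-perturbed counterpart by perplexity under a GPT-2-style LM via Hugging Face transformers. This is a minimal illustration, not the authors' pipeline: the model name "gpt2", the example sentences, and the full-reversal perturbation are assumptions for demonstration (the paper trains GPT-2 small per language and uses perturbations such as Universal-20 reorderings).

```python
# Hypothetical sketch: compare LM perplexity on an attested sentence vs. a
# word-order-perturbed "impossible" counterpart. Not the authors' code.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

model_name = "gpt2"  # stand-in; the paper trains GPT-2 small per language
tokenizer = GPT2TokenizerFast.from_pretrained(model_name)
model = GPT2LMHeadModel.from_pretrained(model_name)
model.eval()

def perplexity(sentence: str) -> float:
    """Perplexity of one sentence under the LM (exp of mean token NLL)."""
    ids = tokenizer(sentence, return_tensors="pt").input_ids
    with torch.no_grad():
        loss = model(ids, labels=ids).loss  # mean cross-entropy over tokens
    return torch.exp(loss).item()

attested = "the cat sat on the mat"
# One simple perturbation (illustrative only): full word-order reversal.
impossible = " ".join(reversed(attested.split()))

print(f"attested:  PPL = {perplexity(attested):.2f}")
print(f"perturbed: PPL = {perplexity(impossible):.2f}")
```

Lower perplexity on the attested sentence than on its perturbed counterpart would indicate the kind of separation the paper measures across languages and perturbation types.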
Paper Type: Long
Research Area: Linguistic theories, Cognitive Modeling and Psycholinguistics
Research Area Keywords: linguistic theories, cognitive modeling, computational psycholinguistics
Contribution Types: Model analysis & interpretability, Reproduction study, Data resources, Theory
Languages Studied: English, Chinese, Russian, Arabic, Turkish, German, Romanian, Dutch, Polish, Portuguese, Italian, Hebrew, Hungarian, Japanese, Korean, Indonesian, Greek, Persian, Lithuanian, Vietnamese, French, Spanish, Czech, Bulgarian, Slovak, Swedish, Serbian, Croatian, Ukrainian, Danish
Submission Number: 2725