Noise-tolerant learning as selection among deterministic grammatical hypotheses

Published: 15 Jun 2023, Last Modified: 05 Dec 2024 · Proceedings of the Society for Computation in Linguistics 2023 · CC BY 4.0
Abstract: Children acquire their language’s canonical word order from data that contains a messy mixture of canonical and non-canonical clause types. We model this as noise-tolerant learning of grammars that deterministically produce a single word order. In simulations on English and French, our model successfully separates the signal from the noise introduced by non-canonical clause types and correctly identifies both languages as SVO. No such preference for the target word order emerges from a comparison model that operates over a fully gradient hypothesis space with an explicit numerical regularization bias. This provides an alternative general mechanism for regularization across learning domains, whereby tendencies to regularize emerge from a learner’s expectation that the data are a noisy realization of a deterministic underlying system.
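To make the general mechanism concrete, here is a minimal sketch of selection among deterministic word-order hypotheses under an assumed noise model; it is not the paper's implementation. The uniform-mislabeling noise model, the toy clause counts, and the helper names (`log_likelihood`, `best_hypothesis`) are all assumptions introduced for illustration.

```python
"""
Illustrative sketch (not the paper's model): choose among deterministic
word-order grammars, treating non-canonical clauses as noise at rate eps.
"""
from math import log

ORDERS = ["SVO", "SOV", "VSO", "VOS", "OVS", "OSV"]

def log_likelihood(counts, hypothesis, eps):
    """Log-likelihood of clause-order counts under a deterministic grammar,
    assuming each clause surfaces with the canonical order with probability
    (1 - eps) and otherwise with a uniformly random order (noise)."""
    ll = 0.0
    for order, n in counts.items():
        p = eps / len(ORDERS)          # noise can yield any order
        if order == hypothesis:
            p += 1.0 - eps             # signal: the deterministic output
        ll += n * log(p)
    return ll

def best_hypothesis(counts, eps_grid=None):
    """Score each deterministic hypothesis at its best-fitting noise rate."""
    eps_grid = eps_grid or [i / 100 for i in range(1, 100)]
    scores = {
        h: max(log_likelihood(counts, h, e) for e in eps_grid)
        for h in ORDERS
    }
    return max(scores, key=scores.get), scores

# Hypothetical toy corpus: mostly canonical SVO clauses plus a mix of
# non-canonical clause types.
toy_counts = {"SVO": 700, "VSO": 120, "OSV": 100, "SOV": 80}

winner, _ = best_hypothesis(toy_counts)
print(winner)  # -> SVO
```

Because every hypothesis is deterministic and all deviations must be absorbed by the noise rate, the hypothesis matching the majority order dominates even with a substantial share of non-canonical clauses; a fully gradient model could instead fit the observed mixture directly, which is the contrast the abstract draws.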