Age and Language Experience Modulate Predictive Processing in the Visual Modality

Published: 03 Oct 2025, Last Modified: 13 Nov 2025 · CPL 2025 Spotlight Poster · CC BY 4.0
Keywords: predictive processing, sign language, neural coherence, EEG, aging, visual language
TL;DR: Lifelong sign language experience refines predictive neural processing in aging brains: EEG shows that visual linguistic coherence increases with age, despite broader neural noise.
Abstract: Across the lifespan, the human brain undergoes functional reorganization that reflects both developmental adaptation and experience-dependent change. One of the central goals of neuroscience is to understand how the brain anticipates and adapts to structured input. Predictive processing frameworks suggest that the brain continuously generates expectations about incoming sensory data: thus, perception is shaped not only by incoming stimuli, but also by top-down expectations grounded in past exposure. Language comprehension, as a high-level cognitive function, provides a natural domain in which to examine such predictive mechanisms. Sign languages offer a unique testbed for studying experience-dependent predictive processing in the visual modality, independent of auditory processing. Deaf signers acquire language through vision and develop highly tuned sensitivity to the temporal dynamics of visual motion. Native Deaf signers allow us to examine how structured, dynamic visual input over a lifetime affects cortical organization. The signed language signal consists of a highly structured visual stream, in which timing and kinematics are tightly linked to linguistic meaning. Understanding how the brain's processing of visual cues for prediction changes over the lifespan is crucial for modeling predictive mechanisms in computational neuroscience. We used EEG recordings from Deaf native signers of Austrian Sign Language (ÖGS) to examine how predictive neural dynamics vary with age and language experience. Participants (N = 24, age range 28–68) viewed natural signed sentences and time-reversed non-linguistic motion controls. Neural signals were analyzed using spectral complexity metrics and coherence with optical flow (a dynamic visual input measure reflecting the temporal structure of sign language). We observed no significant Age × Stimulus interactions in any spectral measure across cortical regions.
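The two analysis measures named above can be illustrated with a minimal sketch. The abstract does not specify implementation details, so the sampling rate, window length, band limits, and the synthetic signals below are illustrative assumptions; real analyses would use the recorded EEG and the frame-wise optical-flow magnitude of the stimulus videos.

```python
import numpy as np
from scipy.signal import coherence, welch

rng = np.random.default_rng(0)
fs = 250.0                       # assumed EEG sampling rate (Hz)
n = int(60 * fs)                 # one minute of synthetic data

# Stand-ins for the optical-flow magnitude (resampled to the EEG rate)
# and one EEG channel partially driven by that visual input.
optical_flow = rng.standard_normal(n)
eeg = 0.5 * optical_flow + rng.standard_normal(n)

# Magnitude-squared coherence between neural signal and visual input.
freqs, coh = coherence(eeg, optical_flow, fs=fs, nperseg=1024)
low_band = coh[(freqs >= 0.5) & (freqs <= 4.0)].mean()  # assumed low-frequency band

# Spectral flatness (Wiener entropy): geometric / arithmetic mean of the PSD.
# Values near 1 indicate a flat, noise-like spectrum; lower values indicate
# more structured (peaked) spectra.
f, psd = welch(eeg, fs=fs, nperseg=1024)
flatness = np.exp(np.mean(np.log(psd))) / np.mean(psd)
```

Coherence is bounded in [0, 1] per frequency bin, which makes band-averaged values comparable across participants and regions; spectral flatness summarizes how noise-like a channel's spectrum is in a single scalar.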
This dissociation suggests that the influence of age on EEG spectral features—such as increased spectral flatness—is likely structural in nature, reflecting general neurophysiological changes associated with aging (e.g., reduced neural differentiation). Additionally, a machine learning pipeline identified EEG features most diagnostic of signed versus reversed input across all participants. In this analysis, age effects were modulated by brain region and condition: in non-linguistic reversed videos, older participants exhibited delayed entrainment in frontal regions, suggesting difficulty integrating unpredictable sensory input with internal models. Conversely, in the sign language condition, age was positively associated with posterior coherence strength, particularly at low frequencies associated with phrasal or supra-lexical timescales. This finding points to a refinement of predictive mechanisms with lifelong language experience. Our findings support the hypothesis that lifelong visual language experience refines predictive coding in the brain. The transition from sensory-driven to model-driven inference appears to be reflected in frequency-specific coherence dynamics, particularly in frontal and posterior regions. These results expand current models of predictive processing by showing how sensory modality and linguistic experience jointly shape cortical aging trajectories.
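The shape of such a feature-diagnostic pipeline can be sketched as follows. The classifier choice, feature set, and data are illustrative assumptions (the abstract names neither); the point is only that a linear model's weights expose which features best separate signed from reversed trials.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# Synthetic stand-in: trials x features (e.g., band power or coherence per
# region); the study's actual feature set is not specified here.
n_trials, n_features = 200, 12
X = rng.standard_normal((n_trials, n_features))
y = rng.integers(0, 2, n_trials)      # 0 = reversed, 1 = signed
X[y == 1, 0] += 1.0                   # inject one diagnostic feature

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X, y, cv=5)   # cross-validated accuracy

# After fitting, coefficient magnitudes rank features by diagnosticity.
clf.fit(X, y)
weights = np.abs(clf.named_steps["logisticregression"].coef_.ravel())
```

Cross-validation guards against overfitting the feature ranking to any single split, which matters with small-N EEG datasets like the one described here.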
Submission Number: 35