Modeling Cue-based retrieval and prediction based on morpheme-level cues

Published: 03 Oct 2025, Last Modified: 13 Nov 2025 · CPL 2025 Poster · CC BY 4.0
Keywords: memory retrieval, prediction, ACT-R, morpheme-level cues, cue-based retrieval, pronouns
TL;DR: We propose an ACT-R model that explains the processing of German possessive pronouns using morpheme-level cues to investigate the interaction between memory retrieval and prediction.
Abstract:

Background: Cue-based retrieval models typically assume word-by-word sentence processing, meaning that memory retrieval is triggered by cues associated with whole words. [1] developed an ACT-R model of the processing of German possessive pronouns, which involves both antecedent retrieval and predictive processing. In their model, both antecedent retrieval and prediction were implemented as cue-based retrievals triggered by word-level features. The model was designed to explain patterns observed in a visual world eye-tracking study in which participants heard instructions (as in Fig. 1) presented alongside a visual display containing two gender-mismatching objects. The model successfully captured the timing of prediction onsets observed in human participants. Critically, this paradigm involved MATCH and MISMATCH conditions, depending on whether the gender of the antecedent and the possessee matched. Empirically, prediction onsets were delayed in MISMATCH conditions compared to MATCH conditions, reflecting an interaction between memory retrieval and predictive processing [2]. The model explained this interaction through similarity-based interference during antecedent retrieval influencing later picture prediction. However, when the same model was extended to a condition involving indefinite determiners, which requires prediction but no antecedent retrieval, it failed to reproduce the observed pattern of prediction onsets [3]. Importantly, experimental evidence suggests that comprehenders do not treat inflected words as unanalyzed wholes but instead decompose them into morphemes [4, 5]. We propose an ACT-R model that operates at the morpheme level, such that antecedent retrieval and prediction are triggered by morpheme-level cues.

Method: We extended the ACT-R model developed by [1] to model the prediction onsets observed in [3]. Unlike the original, our model parses the possessive pronoun morpheme by morpheme.
This fine-grained processing allows it to encode retrieval cues, complete antecedent retrieval, and generate predictions based on different morphemes serially within the same word. More specifically, in the MATCH and MISMATCH conditions, the stem triggers lexical processing, antecedent retrieval, and picture prediction. Antecedent retrieval uses gender (e.g., masculine for sein-, feminine for ihr-), animacy, and number cues to identify the possessor. Picture prediction based on the stem relies on gender, animacy, and number cues reflecting the grammatical features of the upcoming noun; the cues the model uses at this point are based on the suffix it expects is most likely to follow. The suffix then undergoes lexical processing and initiates a second picture prediction that uses gender, animacy, and number cues reflecting the grammatical features of the upcoming noun. In the DETERMINER condition, the model follows the same steps but omits antecedent retrieval on the stem, resulting in fewer processing steps overall and avoiding similarity-based interference from antecedent retrieval.

Results: The morpheme-level model shows an improved fit over the word-level model (see Fig. 2): it successfully predicts the pattern of prediction onsets across both possessive pronouns and determiners.

Discussion: Our results suggest that morpheme-level processing offers a more precise account of how memory retrieval and prediction interact in sentence comprehension. However, this model was fitted post hoc to the data from [3]. Future research should evaluate the model on novel data from a broader range of behavioral responses and carry out statistical model comparison.
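The cue-based retrieval mechanism described above can be illustrated with a minimal sketch. This is a hypothetical simplification, not the authors' implementation: chunks, feature names, and the parameter values (match bonus, mismatch penalty, latency factor) are all assumptions for illustration. It shows the core ACT-R idea that activation rises with matching cues and falls with mismatching ones, and that retrieval latency decreases exponentially with activation, so partially matching distractors produce similarity-based interference.

```python
import math

MATCH_BONUS = 1.0        # assumed spreading-activation weight per matching cue
MISMATCH_PENALTY = -1.0  # assumed penalty per mismatching cue
LATENCY_FACTOR = 0.2     # ACT-R latency factor F (assumed value, seconds)

def activation(chunk, cues, base=0.0):
    """Base-level activation plus a bonus/penalty per retrieval cue."""
    score = base
    for feature, value in cues.items():
        score += MATCH_BONUS if chunk.get(feature) == value else MISMATCH_PENALTY
    return score

def retrieve(chunks, cues):
    """Return the highest-activation chunk and its retrieval latency F*exp(-A)."""
    best = max(chunks, key=lambda c: activation(c, cues))
    latency = LATENCY_FACTOR * math.exp(-activation(best, cues))
    return best, latency

# Antecedent retrieval cued by the stem ihr- (feminine possessor):
antecedent_cues = {"gender": "fem", "animate": True, "number": "sg"}
chunks = [
    {"name": "Frau", "gender": "fem", "animate": True, "number": "sg"},
    {"name": "Rock", "gender": "masc", "animate": False, "number": "sg"},
]
who, t = retrieve(chunks, antecedent_cues)
```

In this toy setup the fully matching chunk wins quickly; lowering its feature overlap with the cues (as in a MISMATCH condition) reduces activation and lengthens the retrieval latency, which is the interference dynamic the model uses to explain delayed prediction onsets.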
Submission Number: 12