A Platform for Holistic Embodied Models of Infant Cognition, and Its Use in a Model of Event Processing
Abstract: Most computational models of infant cognition focus on selected components of the cognitive system. In this article, we present a model that is more holistic in aspiration: a computational model of a “whole baby,” incorporating a graphical simulation of the baby’s body and a multicomponent brain model. The model is realistic and fast enough to interact in real time with human users playing a caregiver role. This allows us to directly compare the behavior of the simulated baby interacting with users to that of real babies interacting with their parents via video chat on touchscreen devices. To illustrate the model, we present the components of the cognitive model involved in processing events and in representing events in working memory and long-term memory. We focus on the processing of motion events, in which an object moves from one location to another. We model how the baby perceives such events, and also how the baby produces them through motor actions. We also introduce our framework for recording and annotating interactions between real babies and their parents, and we describe a preliminary evaluation of our motion event model against manually identified motion events from these interactions.