Emerging Grounded Shared Vocabularies Between Human and Machine, Inspired by Human Language Evolution

Published: 01 Jan 2022, Last Modified: 15 May 2023. Frontiers Artif. Intell. 2022.
Abstract: Building conversational AI systems aims to teach machines to understand human language and respond naturally. The most common way to train agents to produce and interpret natural language is currently to expose them to large quantities of data. Although this has led to advances in many areas, such systems typically have little understanding of how language relates to the real world (Mordatch and Abbeel, 2018), known as the grounding problem. Moreover, most conversational agents are trained in isolation, whereas humans are social animals, deeply embedded in culture and surrounded by others. Complex human behaviors, such as language, evolved in socio-cultural contexts and could not exist without a variety of minds using and transmitting these behaviors.

To overcome this problem, researchers in Computational Linguistics have started to model emergent communication setups, in which novel signals are created by interacting agents (Lazaridou et al., 2018; Mordatch and Abbeel, 2018; Chaabouni et al., 2019; ter Hoeve et al., 2021). However, the findings in such models do not always match what is found in comparable experiments with humans, and features characteristic of human language often do not emerge (Lazaridou et al., 2020).

The mechanisms that influence the emergence of communication and linguistic structure have been studied in the field of Language Evolution. Although the precise origins of human language are widely debated, computer simulations (Boer, 2006; Kirby, 2017; Steel...
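As a rough illustration of the kind of emergent communication setup referred to above, the sketch below implements a minimal Lewis-style signaling game between a sender and a receiver agent. The game size, the agent names, and the simple propensity-based reinforcement update are assumptions made for illustration only; they are not the setup used in the paper or in any of the cited works.

# Minimal Lewis signaling game: a "sender" observes a state and emits a
# signal; a "receiver" maps the signal to an action. Both are rewarded when
# the action matches the state, so a shared state-signal convention can emerge.
import random

N_STATES = 3       # world states the sender can observe (illustrative choice)
N_SIGNALS = 3      # signals available to the sender
N_ROUNDS = 20000

# Propensity tables ("urns"): higher weight -> choice sampled more often.
sender = [[1.0] * N_SIGNALS for _ in range(N_STATES)]    # state -> signal
receiver = [[1.0] * N_STATES for _ in range(N_SIGNALS)]  # signal -> action

def sample(weights):
    # Sample an index in proportion to its weight.
    return random.choices(range(len(weights)), weights=weights, k=1)[0]

for _ in range(N_ROUNDS):
    state = random.randrange(N_STATES)
    signal = sample(sender[state])
    action = sample(receiver[signal])
    if action == state:                 # joint success reinforces both choices
        sender[state][signal] += 1.0
        receiver[signal][action] += 1.0

# Inspect the emergent convention: the signal each state is most often mapped to.
for state, weights in enumerate(sender):
    print(f"state {state} -> signal {weights.index(max(weights))}")

Running the sketch typically ends with each state mapped to a distinct signal, i.e. a small shared vocabulary has emerged from repeated interaction alone; whether such conventions also show the structural features of human language is exactly the open question the abstract raises.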