Eye Gaze and Self-attention: How Humans and Transformers Attend Words in Sentences

Anonymous

16 Nov 2021 (modified: 09 Aug 2023) · ACL ARR 2021 November Blind Submission · Readers: Everyone
Abstract: Attention mechanisms are used both to describe human reading processes and to model natural language in transformer neural networks. On the surface, attention serves very different purposes in these two contexts. However, this paper presents evidence that the two are linked during reading tasks. During reading, the dwell times of human eye movements were strongly correlated with the attention patterns in the early layers of pre-trained transformers such as BERT. Furthermore, we explored which factors drive variation in these correlations and observed that the correlation was stronger when humans read for comprehension than when they searched for specific information. Additionally, the strength of the correlation was not related to the number of parameters in a transformer.
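The abstract describes comparing per-word eye-movement dwell times with attention in early BERT layers. The sketch below is a minimal, hypothetical illustration of one way such a comparison could be set up with the Hugging Face transformers library; the example sentence, the made-up dwell times, the choice of layer, the head-averaging, and the subword-to-word aggregation are all assumptions, not the authors' actual pipeline.

```python
# Hypothetical sketch: correlate attention received per word (early BERT layer)
# with eye-tracking dwell times. Not the authors' exact method.
import numpy as np
import torch
from scipy.stats import spearmanr
from transformers import BertTokenizerFast, BertModel

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_attentions=True)
model.eval()

sentence = "The quick brown fox jumps over the lazy dog"
# Made-up dwell times (ms) per word, standing in for an eye-tracking corpus.
dwell_times = np.array([180.0, 240.0, 230.0, 260.0, 210.0, 150.0, 170.0, 250.0, 270.0])

inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# outputs.attentions is a tuple of (batch, heads, seq, seq) tensors, one per layer.
layer = 1  # an early layer, per the abstract's finding
attn = outputs.attentions[layer][0].mean(dim=0)  # average over heads -> (seq, seq)
# Attention each token receives, averaged over all query tokens.
attention_received = attn.mean(dim=0).numpy()

# Sum subword-token attention back onto words (one simple aggregation choice).
word_ids = inputs.word_ids(0)
per_word_attention = np.zeros(len(sentence.split()))
for tok_idx, w_id in enumerate(word_ids):
    if w_id is not None:  # skip special tokens [CLS]/[SEP]
        per_word_attention[w_id] += attention_received[tok_idx]

rho, p = spearmanr(per_word_attention, dwell_times)
print(f"Spearman correlation between attention and dwell time: {rho:.3f} (p={p:.3f})")
```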