Information-theoretical Complexity Metrics

20 Oct 2023, OpenReview Archive Direct Upload
Abstract: Information-theoretical complexity metrics are auxiliary hypotheses that link theories of parsing and grammar to potentially observable measurements such as reading times and neural signals. This review article considers two such metrics, Surprisal and Entropy Reduction, which are respectively built upon the two most natural notions of ‘information value’ for an observed event (Blachman 1968). The review sketches their conceptual background and touches on their relationship to other theories in cognitive science. It characterizes them as ‘lenses’ through which theorists ‘see’ the information-processing consequences of linguistic grammars. While these metrics are not themselves parsing algorithms, the review identifies candidate mechanisms that have been proposed for both of them.
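The two notions of ‘information value’ underlying these metrics have standard definitions: the Surprisal of an observed event is the negative log of its probability, while Entropy Reduction is the drop in uncertainty over possible continuations once the event is observed. A minimal sketch, using hypothetical toy distributions (the particular probabilities are illustrative, not drawn from the article):

```python
import math

def surprisal(p):
    """Surprisal of an observed event with probability p, in bits."""
    return -math.log2(p)

def entropy(dist):
    """Shannon entropy (bits) of a probability distribution over outcomes."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def entropy_reduction(before, after):
    """Entropy Reduction: the non-negative drop in uncertainty about
    the continuations once a word has been observed."""
    return max(0.0, entropy(before) - entropy(after))

# Hypothetical distributions over sentence continuations, before and
# after a disambiguating word is heard.
before = {"A": 0.5, "B": 0.25, "C": 0.25}  # H = 1.5 bits
after = {"A": 1.0}                          # H = 0 bits

print(surprisal(0.25))                   # a 1-in-4 word carries 2.0 bits
print(entropy_reduction(before, after))  # uncertainty drops by 1.5 bits
```

Note that the two quantities can dissociate: a word may itself be improbable (high Surprisal) yet leave the distribution over continuations as uncertain as before (low Entropy Reduction), which is one reason the two metrics make distinct empirical predictions.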