Semantics is Actually 82% Distributional, but Neural Networks Aren't.

Anonymous

16 Jan 2022 (modified: 05 May 2023) · ACL ARR 2022 January Blind Submission · Readers: Everyone
Abstract: Distributional semantics is often proposed as the linguistic theory underpinning many of the most effective current NLP systems. In the present paper, we question the linguistic well-foundedness of these models, addressing it from the perspective of distributional substitution. To that end, we provide a dataset of human judgments on the distributional hypothesis, and highlight that humans cannot systematically distinguish pairs of words from contextual information alone. We stress that earlier static embedding architectures are competitive with more modern contextual embeddings on the distributional substitution task, and that neither serves as a good model of human linguistic behavior.
Paper Type: long