Word sense generation

Anonymous

16 Oct 2022 (modified: 05 May 2023) · ACL ARR 2022 October Blind Submission · Readers: Everyone
Keywords: word sense, lexical creativity, polysemy, probabilistic modeling, contextualized language model, chaining, metonymy, metaphor
Abstract: The lexicon makes creative reuse of words to express novel senses. A long-standing effort in natural language processing has focused on disambiguating and inducing word senses from context, but little work has explored how novel word senses might be generated automatically. We consider a paradigm of word sense generation (WSG) that enables words to spawn new senses by extending toward novel naturalistic contexts. We develop a general framework that simulates novel word sense extension by partitioning a word into hypothetical child tokens and inferring the plausibility of sense extension among the sibling tokens in usage sentences never seen during training. Our framework combines probabilistic models of chaining with a learning scheme that transforms a contextualized language model embedding space to support various types of word sense extension. We evaluate our framework rigorously against several competitive baselines and show that it is superior at predicting plausible novel senses, including metonymic and metaphoric word usages, over a large set of 1,500 English verbs. We further show that the learned semantic space exhibits systematic patterns of word sense extension while retaining competence in common natural language processing tasks.
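The abstract's core idea — scoring how plausibly a word's existing usages extend to a novel context via probabilistic chaining — can be illustrated with a minimal exemplar-based sketch. This is not the paper's exact formulation: the function name, the cosine-similarity kernel, and the `temperature` parameter are all assumptions; the sketch only shows the general shape of an exemplar chaining score over contextualized usage embeddings.

```python
import numpy as np

def chaining_plausibility(novel_usage, exemplars, temperature=1.0):
    """Exemplar-based chaining score (illustrative sketch, not the
    paper's model): plausibility that a word's senses extend to a novel
    usage, taken as the mean kernel similarity between the novel usage
    embedding and the word's existing usage exemplars.

    novel_usage: 1-D embedding of the novel usage context.
    exemplars:   2-D array, one row per attested usage embedding.
    """
    # Normalize so that the dot products below are cosine similarities.
    novel = novel_usage / np.linalg.norm(novel_usage)
    ex = exemplars / np.linalg.norm(exemplars, axis=1, keepdims=True)
    sims = ex @ novel  # cosine similarities in [-1, 1]
    # Exponential kernel over similarity, averaged across exemplars.
    return float(np.mean(np.exp(sims / temperature)))
```

Under this sketch, a novel usage whose embedding sits close to a word's attested usages receives a higher extension score than one far from all of them, which is the qualitative behavior a chaining model should exhibit.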
Paper Type: long
Research Area: Semantics: Lexical