Tapping BERT for Preposition Sense Disambiguation

Anonymous

16 Jan 2022 (modified: 05 May 2023) · ACL ARR 2022 January Blind Submission
Abstract: Prepositions are frequently occurring polysemous words. Disambiguating them is crucial in tasks such as semantic role labelling, question answering, text entailment, and noun compound paraphrasing. In this paper, we propose a novel methodology for preposition sense disambiguation (PSD) that does not rely on any linguistic tools. In a supervised setting, the machine learning model is presented with sentences in which prepositions have been annotated with 'senses'. These 'senses' are IDs from The Preposition Project (TPP). We use the hidden-layer representations from pre-trained BERT and its variants, and classify these latent representations into the correct sense ID with a Multi-Layer Perceptron. The datasets used for this task come from SemEval-2007 Task 6 and the Oxford English Corpus (OEC). Our methodology achieves an accuracy of 86.85% on the SemEval task, surpassing the state-of-the-art.
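The pipeline the abstract describes (a contextual token representation fed into an MLP that outputs a TPP sense ID) can be sketched as follows. This is a minimal illustration, not the authors' code: the weights are random placeholders rather than trained parameters, and the MLP width and number of sense classes are assumed values chosen only for the example.

```python
import numpy as np

# Hypothetical sketch of the classification step: map a BERT-style
# hidden vector for a preposition token to one of K TPP sense IDs
# via a one-hidden-layer MLP. HIDDEN matches BERT-base; MLP_DIM and
# N_SENSES are illustrative assumptions, not values from the paper.
rng = np.random.default_rng(0)

HIDDEN = 768      # BERT-base hidden size
MLP_DIM = 256     # assumed MLP width (hypothetical)
N_SENSES = 20     # assumed number of TPP senses for one preposition

# Random stand-ins for what would be trained MLP parameters.
W1 = rng.normal(0.0, 0.02, (HIDDEN, MLP_DIM))
b1 = np.zeros(MLP_DIM)
W2 = rng.normal(0.0, 0.02, (MLP_DIM, N_SENSES))
b2 = np.zeros(N_SENSES)

def predict_sense(token_vec: np.ndarray) -> int:
    """Classify a preposition token's hidden vector into a sense-ID index."""
    h = np.maximum(0.0, token_vec @ W1 + b1)  # ReLU hidden layer
    logits = h @ W2 + b2
    return int(np.argmax(logits))             # highest-scoring sense ID

# A stand-in for BERT's hidden state at the preposition's position.
fake_hidden = rng.normal(0.0, 1.0, HIDDEN)
sense_id = predict_sense(fake_hidden)
print(sense_id)
```

In the actual system, `token_vec` would be the hidden-layer output of a pre-trained BERT model at the preposition's token position, and the MLP would be trained on the annotated SemEval-2007 Task 6 / OEC sentences.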
Paper Type: short