Language Models and Semantic Relations: A Dual Relationship

Published: 01 Jan 2024 · Last Modified: 04 Mar 2025 · LREC/COLING 2024 · License: CC BY-SA 4.0
Abstract: Because they rely on the distributional hypothesis, static and contextual language models are closely linked to lexical semantic relations. In this paper, we exploit this link to enhance a BERT model. More precisely, we propose to extract lexical semantic relations with two unsupervised methods, one based on a static language model and the other on a contextual model, and to inject the extracted relations into a BERT model to improve its semantic capabilities. Through various evaluations performed for English and focusing on semantic similarity at the word and sentence levels, we demonstrate the value of this approach, which allows us to semantically enrich a BERT model without using any external semantic resource.
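The abstract describes a two-step pipeline: unsupervised extraction of related word pairs from language-model representations, followed by injection of those pairs into BERT. The sketch below illustrates one plausible reading of that pipeline, not the authors' actual method: the static embeddings are toy random vectors standing in for a real static model such as word2vec or GloVe, relation extraction is reduced to nearest-neighbour cosine similarity, and injection is approximated with a cosine-embedding fine-tuning loss. The vocabulary, threshold rule, loss, and learning rate are all illustrative assumptions.

```python
# Hypothetical sketch of the two-step idea described in the abstract:
# (1) extract candidate semantic relations from a static embedding space
#     via nearest-neighbour cosine similarity (distributional hypothesis),
# (2) "inject" the extracted pairs into BERT by fine-tuning with a
#     similarity objective. The extraction rule and the loss below are
#     placeholder assumptions, not the paper's actual methods.
import numpy as np
import torch
from transformers import AutoTokenizer, AutoModel

# --- Step 1: toy static embeddings and unsupervised relation extraction ---
vocab = ["car", "automobile", "vehicle", "banana", "fruit", "apple"]
rng = np.random.default_rng(0)
static = rng.normal(size=(len(vocab), 50))            # stand-in for word2vec/GloVe
static /= np.linalg.norm(static, axis=1, keepdims=True)

sims = static @ static.T                              # cosine similarity matrix
# Nearest neighbour per word, excluding the word itself (diagonal pushed down).
nn_idx = (sims - 2 * np.eye(len(vocab))).argmax(axis=1)
pairs = [(vocab[i], vocab[int(j)]) for i, j in enumerate(nn_idx)]

# --- Step 2: inject the extracted pairs into BERT with a similarity loss ---
tok = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased")
bert.train()
loss_fn = torch.nn.CosineEmbeddingLoss()
opt = torch.optim.AdamW(bert.parameters(), lr=2e-5)   # assumed learning rate

def embed(words):
    """Mean-pooled BERT embeddings for a batch of single words."""
    batch = tok(words, return_tensors="pt", padding=True)
    out = bert(**batch).last_hidden_state
    mask = batch["attention_mask"].unsqueeze(-1)
    return (out * mask).sum(1) / mask.sum(1)

for w1, w2 in pairs:                                  # one illustrative pass
    e1, e2 = embed([w1]), embed([w2])
    loss = loss_fn(e1, e2, torch.ones(1))             # pull related words together
    loss.backward()
    opt.step()
    opt.zero_grad()
```

In this reading, the static model supplies cheap, corpus-level relation candidates, and fine-tuning simply pulls the BERT representations of related words closer together, which matches the abstract's claim of semantic enrichment without any external resource.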