Abstract: Antonym detection is a vital task for NLP systems. Pattern-based methods, the typical solution, recognize semantic relationships between words using given patterns but achieve limited performance. Distributed word embeddings often fail to distinguish antonyms from synonyms because their representations rely on local co-occurrences in similar contexts. Given the ambiguity of Chinese and the contradictory nature of antonyms, Chinese antonym detection faces unique challenges. In this paper, we propose a word-sememe graph, organized as a 4-partite graph, to integrate the relationships between sememes and Chinese words. We design a heuristic sememe relevance computation as a supplementary measure and develop a relation inference scheme that uses related sememes as taxonomic information to exploit relational transitivity; the 4-partite graph can be extended with this scheme. We further introduce the Relation Discriminated Learning based on Sememe Attention (RDLSA) model, which employs three attention strategies over sememes to learn flexible entity representations. Antonym relations are then detected with a Link Prediction approach over these embeddings. Our method outperforms the baselines on Triple Classification and Chinese Antonym Detection. Experimental results show that linguistic sememes reduce ambiguity and improve antonym detection, and a quantitative ablation analysis further confirms the effectiveness of our scheme in capturing antonyms.
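The link-prediction step mentioned above can be illustrated with a minimal TransE-style sketch. Everything here is an illustrative assumption, not the actual RDLSA model: the word names, the random embeddings, the softmax attention pooling, and the relation vector are all placeholders standing in for learned parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8

# Hypothetical sememe embedding matrices for two words (rows = sememes).
sememes = {
    "hot":  rng.normal(size=(3, dim)),
    "cold": rng.normal(size=(2, dim)),
}

def attention_pool(sememe_vecs, query):
    """Attention-weighted average of a word's sememe embeddings."""
    scores = sememe_vecs @ query
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ sememe_vecs

# Relation vector for "antonym-of" (random here; learned in practice).
r_antonym = rng.normal(size=dim)

# Pool each word's sememes into a single entity representation.
h = attention_pool(sememes["hot"], r_antonym)
t = attention_pool(sememes["cold"], r_antonym)

# TransE-style plausibility: smaller ||h + r - t|| means a more
# plausible (head, relation, tail) triple, so the score is its negation.
score = -np.linalg.norm(h + r_antonym - t)
```

A triple classifier would then threshold `score` to decide whether the antonym relation holds between the two words.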