Finding Meaning in Embeddings: Concept Separation Curves

ACL ARR 2026 January Submission 617 Authors

23 Dec 2025 (modified: 20 Mar 2026) · ACL ARR 2026 January Submission · CC BY 4.0
Keywords: Embedding, Permutation, Understanding
Abstract: Sentence embedding techniques aim to encode the key concepts of a sentence's meaning in a vector space. However, most approaches to evaluating sentence embedding quality rely on additional classifiers or downstream tasks, making it unclear whether good results stem from the embedding itself or from the classifier's behaviour. In this paper, we propose a novel method for evaluating how effectively sentence embedding methods capture sentence-level concepts. Our approach is classifier-independent and language-agnostic, allowing for an objective assessment of a model's performance. We systematically introduce syntactic noise and semantic negations into sentences and quantify their relative effects on the resulting embeddings. We visualise these effects with Concept Separation Curves, which show a model's capacity to differentiate between conceptual and surface-level variations. Using data from multiple domains, in both Dutch and English, and across a range of sentence lengths, we demonstrate that Concept Separation Curves provide an interpretable, reproducible, and cross-model approach for evaluating the conceptual stability of sentence embeddings.
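The abstract does not spell out the paper's exact perturbation scheme or how the curves are constructed, but the underlying comparison can be sketched. The snippet below is a minimal, illustrative version under stated assumptions: a toy hash-based bag-of-words encoder stands in for a real sentence-embedding model, word shuffling stands in for syntactic noise, and inserting "not" stands in for semantic negation; the `embed`, `separation`, and perturbation functions are hypothetical names, not the authors' implementation. A conceptually stable encoder should assign a small distance to the noisy variant relative to the negated one.

```python
import hashlib
import math
import random


def embed(sentence, dim=64):
    """Toy bag-of-words hash embedding; a stand-in for a real sentence encoder."""
    vec = [0.0] * dim
    for word in sentence.lower().split():
        bucket = int(hashlib.md5(word.encode()).hexdigest(), 16) % dim
        vec[bucket] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]


def cosine_distance(a, b):
    """Cosine distance between two unit-normalised vectors."""
    return 1.0 - sum(x * y for x, y in zip(a, b))


def syntactic_noise(sentence, rng):
    """Surface-level perturbation: shuffle word order, meaning roughly preserved."""
    words = sentence.split()
    rng.shuffle(words)
    return " ".join(words)


def semantic_negation(sentence):
    """Concept-level perturbation: crude negation by inserting 'not'."""
    words = sentence.split()
    return " ".join(words[:1] + ["not"] + words[1:])


def separation(sentence, seed=0):
    """Embedding distance to a noisy variant vs. to a negated variant."""
    rng = random.Random(seed)
    base = embed(sentence)
    d_noise = cosine_distance(base, embed(syntactic_noise(sentence, rng)))
    d_negation = cosine_distance(base, embed(semantic_negation(sentence)))
    return d_noise, d_negation


if __name__ == "__main__":
    for s in ["the cat sat on the mat", "embeddings capture sentence meaning"]:
        d_noise, d_negation = separation(s)
        print(f"{s!r}: noise={d_noise:.3f} negation={d_negation:.3f}")
```

Because the toy bag-of-words encoder is order-invariant, the noise distance here is exactly zero while the negation distance is positive; a real sentence encoder would yield nonzero values for both, and sweeping the perturbation strength would trace out one point per level on a Concept Separation Curve.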
Paper Type: Long
Research Area: Interpretability and Analysis of Models for NLP
Research Area Keywords: metrics, phrase/sentence embedding, benchmarking
Contribution Types: Model analysis & interpretability
Languages Studied: English, Dutch
Submission Number: 617