Representing Sentence Interpretations with Overlapping Box Embeddings

ACL ARR 2024 December Submission556 Authors

14 Dec 2024 (modified: 05 Feb 2025) · ACL ARR 2024 December Submission · CC BY 4.0
Abstract: Most previous studies on sentence embeddings aim to obtain a single representation per sentence. However, this approach is inadequate for handling relations between sentences when a sentence admits multiple interpretations. To address this problem, we propose a novel concept, interpretation embeddings, which represent the individual interpretations of a sentence. We propose GumbelCSE, a contrastive learning method for learning box embeddings of sentences. Interpretation embeddings are derived by measuring the overlap between the box embedding of the target sentence and those of other sentences. We evaluate our method on four tasks: Recognizing Textual Entailment (RTE), Entailment Direction Prediction, Ambiguous RTE, and Conditional Semantic Textual Similarity (C-STS). On RTE and Entailment Direction Prediction, GumbelCSE outperforms other sentence embedding methods in most cases. On Ambiguous RTE and C-STS, we demonstrate that interpretation embeddings effectively capture the ambiguity of meaning inherent in a sentence.
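The abstract describes deriving interpretation embeddings from the overlap between sentence box embeddings. As a rough, hedged illustration of how such an overlap can be scored for Gumbel-style boxes (a generic sketch, not the authors' implementation; all function names, the `beta` temperature, and the softplus-based volume are assumptions), one might compute a soft intersection and its log-volume as follows:

```python
import torch
import torch.nn.functional as F


def gumbel_intersection(a_min, a_max, b_min, b_max, beta=1.0):
    """Soft intersection corners of two axis-aligned boxes via logsumexp smoothing."""
    inter_min = beta * torch.logsumexp(
        torch.stack([a_min / beta, b_min / beta]), dim=0)
    inter_max = -beta * torch.logsumexp(
        torch.stack([-a_max / beta, -b_max / beta]), dim=0)
    return inter_min, inter_max


def soft_log_volume(box_min, box_max, beta=1.0, eps=1e-12):
    """Log-volume with softplus side lengths, so empty overlaps stay differentiable."""
    side = F.softplus(box_max - box_min, beta=1.0 / beta)
    return torch.log(side + eps).sum(dim=-1)


def overlap_score(a_min, a_max, b_min, b_max, beta=1.0):
    """Approximate log P(B | A): log vol(A ∩ B) - log vol(A)."""
    i_min, i_max = gumbel_intersection(a_min, a_max, b_min, b_max, beta)
    return soft_log_volume(i_min, i_max, beta) - soft_log_volume(a_min, a_max, beta)


# Usage sketch: each sentence encoder output is split into a min corner and a max corner.
a_min, a_max = torch.randn(64), torch.randn(64) + 2.0
b_min, b_max = torch.randn(64), torch.randn(64) + 2.0
print(overlap_score(a_min, a_max, b_min, b_max).item())
```

In this sketch, the region where the target sentence's box overlaps another sentence's box would play the role of an interpretation embedding; the exact parameterization used by GumbelCSE may differ.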
Paper Type: Long
Research Area: Semantics: Lexical and Sentence-Level
Research Area Keywords: Machine Learning for NLP, Semantics: Lexical and Sentence-Level
Contribution Types: NLP engineering experiment
Languages Studied: English
Submission Number: 556
