Keywords: representation learning, embedding-based retrieval
Abstract: This paper studies the minimal dimension required to embed subset memberships ($m$ elements and their ${m\choose k}$ subsets of at most $k$ elements) into vector spaces, a quantity we call the Minimal Embeddable Dimension (MED).
Tight bounds on the MED are derived theoretically and supported empirically for several notions of distance or similarity, including the $\ell_2$ metric, inner product, and cosine similarity.
In addition, we conduct numerical simulations in a more practical setting, where each of the ${m\choose k}$ subset embeddings is the centroid of the embeddings of its elements. These simulations readily achieve a logarithmic dependence of the MED on the number of elements.
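The centroid-based simulation described above can be sketched as follows. This is a hypothetical illustration, not the authors' code: it draws random element embeddings, embeds each $k$-subset as the centroid of its members, and scans for the smallest dimension at which inner-product scores separate every subset's members from non-members. The function names (`separates`, `estimate_med`) and the separation criterion are assumptions for illustration.

```python
from itertools import combinations

import numpy as np


def separates(m, k, d, rng):
    """Return True if, for every k-subset, all member elements score
    strictly higher (inner product with the subset centroid) than all
    non-members, under one random draw of element embeddings."""
    E = rng.standard_normal((m, d)) / np.sqrt(d)  # random element embeddings
    for S in combinations(range(m), k):
        centroid = E[list(S)].mean(axis=0)        # centroid subset embedding
        scores = E @ centroid
        members = np.zeros(m, dtype=bool)
        members[list(S)] = True
        if scores[members].min() <= scores[~members].max():
            return False
    return True


def estimate_med(m, k, seed=0, d_max=512):
    """Smallest dimension d <= d_max at which the random centroid
    construction separates all k-subsets (None if none is found)."""
    for d in range(1, d_max + 1):
        if separates(m, k, d, np.random.default_rng(seed)):
            return d
    return None
```

Plotting `estimate_med(m, k)` against growing `m` (for fixed `k`) is one way to probe the logarithmic trend empirically; the exhaustive scan over all ${m\choose k}$ subsets keeps this sketch practical only for small `m`.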
These findings suggest that the limitations of embedding-based retrieval stem primarily from learnability challenges rather than geometric constraints, which can guide future algorithm design.
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Submission Number: 4333