Capacity Analysis of Vector Symbolic Architectures

22 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: representation learning for computer vision, audio, language, and other modalities
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Hyperdimensional computing, Vector Symbolic Architectures, representation learning, sketching, dimensionality reduction
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: Hyperdimensional computing (HDC) is a biologically inspired framework that represents symbols with high-dimensional vectors and uses vector operations to manipulate them. The combination of a particular vector space and a prescribed set of vector operations (e.g., addition-like for "bundling" and outer-product-like for "binding") forms a vector symbolic architecture (VSA). While VSAs have been employed in numerous learning applications and have been studied empirically, many theoretical questions about VSAs remain open. In this paper, we analyze the representation capacities of four common VSAs: MAP-I, MAP-B, and two VSAs based on sparse binary vectors. "Representation capacity" here refers to bounds on the dimension of the VSA vectors required to perform certain symbolic tasks, such as testing for set membership and estimating set intersection sizes for two sets of symbols, to a given degree of accuracy. We also analyze the ability of a novel variant of a Hopfield network (a simple model of associative memory) to perform some of the same tasks that are typically asked of VSAs. In addition to providing new bounds on VSA capacities, our analyses establish and leverage connections between VSAs, "sketching" (dimensionality reduction) algorithms, and Bloom filters.
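To make the abstract's setting concrete, here is a minimal illustrative sketch (not taken from the paper) of a MAP-I-style VSA, assuming the standard construction with random bipolar codebook vectors, elementwise summation for bundling, and a normalized dot product for similarity. It shows the two symbolic tasks the abstract mentions: set membership testing and set intersection size estimation. The dimension D, the symbol names, and the helper functions are illustrative choices, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # hypervector dimension (illustrative choice)

# Random bipolar {-1, +1} codebook: one hypervector per symbol.
symbols = {name: rng.choice([-1, 1], size=D) for name in "abcdefgh"}

def bundle(names):
    """Bundle a set of symbols into one hypervector by elementwise summation."""
    return np.sum([symbols[n] for n in names], axis=0)

def similarity(x, y):
    """Normalized dot product; concentrates near 0 for unrelated random vectors."""
    return x @ y / D

# Set membership test: the bundled vector retains a detectable trace of each member.
S = bundle(["a", "b", "c", "d"])
print(similarity(S, symbols["a"]))  # ~1  -> "a" appears to be in the set
print(similarity(S, symbols["h"]))  # ~0  -> "h" appears not to be in the set

# Intersection size estimate: the dot product of two bundled sets scales
# with |A ∩ B| (here the true intersection {"c", "d"} has size 2).
A = bundle(["a", "b", "c", "d"])
B = bundle(["c", "d", "e", "f"])
print(similarity(A, B))  # ~2
```

In this sketch, the random cross terms between distinct codebook vectors have magnitude O(sqrt(1/D)) after normalization, which is why both estimates concentrate around their true values as D grows; bounding how large D must be for a prescribed accuracy is the kind of "representation capacity" question the abstract describes.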
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 6398