Word2HyperVec: From Word Embeddings to Hypervectors for Hyperdimensional Computing

Published: 01 Jan 2024 · Last Modified: 25 Dec 2024 · ACM Great Lakes Symposium on VLSI 2024 · CC BY-SA 4.0
Abstract: Word-aware sentiment analysis has posed a significant challenge over the past decade. Despite the considerable progress of recent language models, achieving a representation lightweight enough for deployment on resource-constrained edge devices remains a crucial concern. This study proposes a novel solution by merging two emerging paradigms, the Word2Vec language model and Hyperdimensional Computing, in a framework named Word2HyperVec. Our framework prioritizes small model size and facilitates low-power inference by mapping embeddings into a binary space. Our solution demonstrates significant advantages, consuming only 2.2 W, up to 1.81× more efficient than alternative learning models such as support vector machines, random forests, and multi-layer perceptrons.
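The core idea of mapping dense embeddings into a binary hypervector space can be illustrated with a short sketch. The snippet below is not the paper's exact algorithm; it shows one common way to binarize embeddings for hyperdimensional computing, using a fixed random projection followed by a sign threshold (the dimensions `EMBED_DIM` and `HYPER_DIM` are illustrative assumptions):

```python
import numpy as np

# Illustrative sketch, NOT the paper's published method: binarize a dense
# word embedding into a high-dimensional binary hypervector via a fixed
# random projection followed by a sign threshold.

EMBED_DIM = 300      # typical Word2Vec embedding size (assumption)
HYPER_DIM = 10_000   # typical hyperdimensional-computing dimensionality

rng = np.random.default_rng(0)
# One fixed projection matrix is shared by all words so that similar
# embeddings map to similar hypervectors.
projection = rng.standard_normal((HYPER_DIM, EMBED_DIM))

def to_hypervector(embedding: np.ndarray) -> np.ndarray:
    """Map a dense embedding to a binary {0, 1} hypervector."""
    return (projection @ embedding > 0).astype(np.uint8)

def hamming_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Fraction of matching bits; 1.0 means identical hypervectors."""
    return float(np.mean(a == b))

# Nearby embeddings stay nearby after binarization, while unrelated
# embeddings land near 0.5 similarity (chance level for random bits).
base = rng.standard_normal(EMBED_DIM)
near = base + 0.05 * rng.standard_normal(EMBED_DIM)
far = rng.standard_normal(EMBED_DIM)

sim_near = hamming_similarity(to_hypervector(base), to_hypervector(near))
sim_far = hamming_similarity(to_hypervector(base), to_hypervector(far))
```

Because the resulting representation is binary, similarity reduces to bit-level comparisons (XOR and popcount in hardware), which is what enables the low-power inference the abstract describes.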