Abstract: Hyperdimensional Computing (HDC) is a brain-inspired, lightweight computing paradigm that has shown great potential for inference on the edge and on emerging hardware technologies, achieving state-of-the-art accuracy on certain classification tasks. HDC classifiers are inherently error resilient and support early termination of inference to approximate classification results. Practitioners have developed heuristic methods to terminate inference early for individual inputs, reducing inference computation at the cost of accuracy. These techniques lack statistical guarantees and may unacceptably degrade classification accuracy or terminate inference later than is needed to obtain an accurate result.
We present Omen, the first dynamic HDC optimizer that uses inferential statistics to terminate inference early while providing accuracy guarantees. To realize Omen, we develop a statistical view of HDC that reframes HD computations as statistical sampling and testing tasks, enabling the use of statistical tests. We evaluate Omen on 19 benchmark instantiations of four classification tasks. Omen is computationally efficient, delivering up to 7.21–12.18× inference speed-ups over an unoptimized baseline while incurring only a 0.0–0.7% drop in accuracy. Omen outperforms heuristic methods, achieving a 0.04–5.85× greater speed-up over the unoptimized baseline than heuristic methods do, while maintaining higher or comparable accuracy.
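To make the statistical view concrete, the sketch below treats the per-dimension contributions to an HDC similarity score as a statistical sample and applies a one-sided paired test to decide when the leading class is confidently separated from the runner-up. This is a minimal illustration under assumptions, not Omen's actual procedure: the function name `early_terminate`, the chunked scan, and the choice of a t-test on per-dimension dot-product contributions are all hypothetical choices for exposition.

```python
import numpy as np
from scipy import stats

def early_terminate(query, class_hvs, alpha=0.05, chunk=256):
    """Classify `query` against class hypervectors, stopping early once a
    one-sided paired t-test separates the leading class from the runner-up.
    Illustrative sketch only; the test and parameters are assumptions,
    not Omen's published method."""
    num_classes, D = class_hvs.shape
    # Per-dimension contributions to each class's dot-product similarity.
    contribs = class_hvs * query  # shape: (num_classes, D)
    for end in range(chunk, D + 1, chunk):
        partial = contribs[:, :end].sum(axis=1)
        order = np.argsort(partial)
        leader, runner = order[-1], order[-2]
        # Treat the dimensions seen so far as a sample: is the leader's
        # mean per-dimension contribution significantly above the runner-up's?
        diff = contribs[leader, :end] - contribs[runner, :end]
        if diff.std() > 0:
            _, p = stats.ttest_1samp(diff, 0.0, alternative="greater")
            if p < alpha:
                return int(leader), end  # confident early decision
    # No early exit: fall back to the full similarity computation.
    return int(np.argmax(contribs.sum(axis=1))), D
```

For bipolar hypervectors over {-1, +1}, each contribution is ±1, so the test effectively compares agreement rates between the two closest classes. Note that testing repeatedly after every chunk inflates the false-positive rate; a real implementation with accuracy guarantees would correct for this, for example with a sequential test or an alpha-spending schedule.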