Keywords: Vector Symbolic Architectures, Holographic Reduced Representations, Hadamard Transform, HRR, VTB, MAP, HLB
TL;DR: Starting from the Hadamard transform, we develop a simple method for neuro-symbolic manipulation of vectors that has desirable properties for deep learning.
Abstract: Vector Symbolic Architectures (VSAs) are one approach to developing neuro-symbolic AI, where two vectors in $\mathbb{R}^d$ are 'bound' together to produce a new vector in the same space. VSAs support commutativity and associativity of this binding operation, along with an inverse operation, allowing one to construct symbolic-style manipulations over real-valued vectors. Most VSAs were developed before deep learning and automatic differentiation became popular, and instead focused on efficacy in hand-designed systems. In this work, we introduce the Hadamard-derived Linear Binding (HLB), which is designed to be computationally efficient, effective in classic VSA tasks, and well suited to differentiable systems.
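To make the binding concept from the abstract concrete, the following is a minimal sketch of a classic VSA binding, Holographic Reduced Representations (circular convolution), rather than the paper's HLB operator. It illustrates the properties the abstract names: binding is commutative and associative, and an (approximate) inverse lets one unbind to recover a bound vector. All function names here are illustrative, not from the paper.

```python
import numpy as np

def bind(a, b):
    # Circular convolution via FFT: commutative and associative.
    return np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)).real

def approx_inverse(a):
    # HRR "involution" (approximate inverse): keep element 0,
    # reverse the rest. In the frequency domain this conjugates FFT(a).
    return np.concatenate(([a[0]], a[:0:-1]))

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

rng = np.random.default_rng(0)
d = 1024
# Elements drawn i.i.d. N(0, 1/d) so vectors have roughly unit norm.
a = rng.normal(0.0, 1.0 / np.sqrt(d), d)
b = rng.normal(0.0, 1.0 / np.sqrt(d), d)

c = bind(a, b)                       # bound pair, still in R^d
b_hat = bind(approx_inverse(a), c)   # unbind with a's approximate inverse

print(cosine(b_hat, b))  # well above chance; recovery is approximate, not exact
```

Unbinding recovers `b` only up to noise, which is why cleanup memories are common in hand-designed VSA systems; a binding with a tighter inverse is one of the properties a differentiable setting benefits from.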
Submission Number: 56