Resonator Networks, 1: An Efficient Solution for Factoring High-Dimensional, Distributed Representations of Data Structures.
Abstract: The ability to encode and manipulate data structures with distributed neural representations could qualitatively enhance the capabilities of traditional neural networks by supporting rule-based symbolic reasoning, a central property of cognition. Here we show how this may be accomplished within the framework of Vector Symbolic Architectures (VSAs) (Plate, 1991; Gayler, 1998; Kanerva, 1996), whereby data structures are encoded by combining high-dimensional vectors with operations that together form an algebra on the space of distributed representations. In particular, we propose an efficient solution to a hard combinatorial search problem that arises when decoding elements of a VSA data structure: the factorization of products of multiple codevectors. Our proposed algorithm, called a resonator network, is a new type of recurrent neural network that interleaves VSA multiplication operations and pattern completion. We show in two examples—parsing of a tree-like data structure and parsing of a visual scene—how the factorization problem arises and how the resonator network can solve it. More broadly, resonator networks open the possibility of applying VSAs to myriad artificial intelligence problems in real-world domains. The companion article in this issue (Kent, Frady, Sommer, & Olshausen, 2020) presents a rigorous analysis and evaluation of the performance of resonator networks, showing that they outperform alternative approaches.
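The core operation described in the abstract—iteratively unbinding factor estimates from the composite vector and cleaning them up against a codebook—can be sketched in a few lines of NumPy. This is an illustrative reconstruction under assumptions, not the authors' reference implementation: the dimensionality `N`, codebook size `D`, bipolar (±1) encoding, and superposition initialization are all choices made here for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000  # vector dimensionality (assumed for illustration)
D = 10    # codevectors per factor (assumed for illustration)

def bsign(v):
    # Binarize to {-1, +1}; map zeros to +1 so vectors stay bipolar.
    return np.where(v >= 0, 1, -1)

# Random bipolar codebooks for three factors.
X = rng.choice([-1, 1], size=(D, N))
Y = rng.choice([-1, 1], size=(D, N))
Z = rng.choice([-1, 1], size=(D, N))

# The composite vector to be factored: the Hadamard (elementwise)
# product of one codevector from each codebook.
ix, iy, iz = 3, 7, 9
s = X[ix] * Y[iy] * Z[iz]

def cleanup(v, codebook):
    # Pattern completion: project onto the codebook's span, re-binarize.
    return bsign(codebook.T @ (codebook @ v))

# Initialize each factor estimate to the superposition of all candidates.
x_hat = bsign(X.sum(axis=0))
y_hat = bsign(Y.sum(axis=0))
z_hat = bsign(Z.sum(axis=0))

for _ in range(100):
    # Bipolar vectors are their own multiplicative inverse, so "dividing
    # out" the other factors is just elementwise multiplication with s.
    x_hat = cleanup(s * y_hat * z_hat, X)
    y_hat = cleanup(s * x_hat * z_hat, Y)
    z_hat = cleanup(s * x_hat * y_hat, Z)

# Read out each factor as the nearest codevector in its codebook.
print(np.argmax(X @ x_hat), np.argmax(Y @ y_hat), np.argmax(Z @ z_hat))
```

With `D = 10` per factor, the network searches a space of 1,000 possible triples while only ever representing 30 codevectors, which is the efficiency the abstract refers to.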