Bidirectional Backpropagation for High-Capacity Blocking Networks

ICMLA 2021 (modified: 17 Apr 2023)
Abstract: The new bidirectional backpropagation algorithm helps blocking networks learn and recall large numbers of image patterns. Bidirectional backpropagation exploits backward-pass learning that ordinary unidirectional backpropagation ignores. The backward pass reveals a hidden regressor in classifiers because the input neurons are identity units. Blocking networks let deep classifiers learn and accurately recognize more patterns than older classifiers that use softmax neurons at the output classification layer. Blocking networks instead use logistic neurons at the output layer of each block. They use random bipolar coding from the vertices of a hypercube rather than from the vertices of the simplex embedded in it, as with 1-in-K encoding. Bidirectional deep sweeps improved classification accuracy on the CIFAR-100 image database and did so at little extra computational cost.
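The contrast between 1-in-K (simplex-vertex) coding and random bipolar (hypercube-vertex) coding can be sketched as follows. This is an illustrative sketch only: the code length `n_out`, the use of nearest-codeword decoding, and all variable names are assumptions for exposition, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

K = 100       # number of classes (e.g. CIFAR-100); assumption for illustration
n_out = 20    # assumed bipolar code length; far fewer output units than K

# 1-in-K (one-hot) coding: each class target is a vertex of the K-simplex,
# so the output layer needs K softmax neurons.
one_hot_targets = np.eye(K)

# Random bipolar coding: each class target is a random vertex of the
# {-1, +1}^n_out hypercube, so only n_out output neurons are needed and
# the hypercube offers 2**n_out possible codewords.
bipolar_targets = rng.choice([-1.0, 1.0], size=(K, n_out))

def decode(output, codebook):
    """Classify a network output by its nearest codeword (Euclidean distance)."""
    return int(np.argmin(np.linalg.norm(codebook - output, axis=1)))

# Random bipolar codewords are nearly orthogonal on average, so a noisy
# network output still decodes to the correct class.
target = bipolar_targets[7]
noisy_output = target + 0.3 * rng.standard_normal(n_out)
predicted = decode(noisy_output, bipolar_targets)
```

The capacity argument is geometric: the K-simplex has only K vertices, while the hypercube that contains it has exponentially many, which is why bipolar codes can represent many more patterns with the same number of output units.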