Backpropagation Through States: Training Neural Networks with Sequentially Semiseparable Weight Matrices

Published: 01 Jan 2022, Last Modified: 15 May 2023, EPIA 2022
Abstract: Matrix-vector multiplications usually account for the dominant share of the computational operations needed to propagate information through a neural network. This number of operations can be reduced if the weight matrices are structured. In this paper, we introduce a training algorithm, based on backpropagation, for neural networks with sequentially semiseparable weight matrices. By exploiting the structure of the weight matrices, the computational complexity of the matrix-vector product can be reduced from quadratic to subquadratic. We show that this can lead to reduced computing time on a microcontroller. Furthermore, we analyze the generalization capabilities of neural networks with sequentially semiseparable weight matrices. Our experiments show that neural networks with structured weight matrices can outperform standard feed-forward neural networks in test prediction accuracy on several real-world datasets.
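
The paper itself is not reproduced on this page, but the complexity claim in the abstract can be illustrated with a short sketch. Below is a minimal NumPy example of a matrix-vector product with a sequentially semiseparable matrix in a common generator parametrization (generators named D, P, R, Q, U, W, V here); the names, the uniform block and state sizes, and the particular recursions are illustrative assumptions, not the paper's notation or its training algorithm.

```python
import numpy as np

def sss_matvec(D, P, R, Q, U, W, V, x):
    """Compute y = T x for a block sequentially semiseparable matrix T.

    Block (i, j) of T is assumed to be
        D[i]                                   if i == j,
        P[i] @ R[i-1] @ ... @ R[j+1] @ Q[j].T  if i > j  (lower part),
        U[i] @ W[i+1] @ ... @ W[j-1] @ V[j].T  if i < j  (upper part).

    x is a list of K input blocks; for fixed block size b and state size r
    the cost is O(K * (b*b + b*r + r*r)), i.e. linear in K, versus
    O((K*b)**2) for a dense matrix-vector product.
    """
    K = len(x)
    y = [D[k] @ x[k] for k in range(K)]

    # Forward state recursion for the strictly lower-triangular part:
    # s_{k+1} = R[k] s_k + Q[k].T x_k, with contribution y_k += P[k] s_k.
    s = np.zeros(Q[0].shape[1])
    for k in range(K):
        y[k] = y[k] + P[k] @ s
        s = R[k] @ s + Q[k].T @ x[k]

    # Backward state recursion for the strictly upper-triangular part:
    # t_{k-1} = W[k] t_k + V[k].T x_k, with contribution y_k += U[k] t_k.
    t = np.zeros(V[0].shape[1])
    for k in range(K - 1, -1, -1):
        y[k] = y[k] + U[k] @ t
        t = W[k] @ t + V[k].T @ x[k]

    return y

# Sanity check: assemble the dense matrix from the same generators
# and compare against the structured matrix-vector product.
rng = np.random.default_rng(0)
K, b, r = 5, 3, 2  # number of blocks, block size, state size
D = [rng.standard_normal((b, b)) for _ in range(K)]
P, Q, U, V = ([rng.standard_normal((b, r)) for _ in range(K)] for _ in range(4))
R, W = ([rng.standard_normal((r, r)) for _ in range(K)] for _ in range(2))
x = [rng.standard_normal(b) for _ in range(K)]

T = np.zeros((K * b, K * b))
for i in range(K):
    for j in range(K):
        if i == j:
            blk = D[i]
        elif i > j:
            blk = P[i]
            for k in range(i - 1, j, -1):
                blk = blk @ R[k]
            blk = blk @ Q[j].T
        else:
            blk = U[i]
            for k in range(i + 1, j):
                blk = blk @ W[k]
            blk = blk @ V[j].T
        T[i * b:(i + 1) * b, j * b:(j + 1) * b] = blk

y = sss_matvec(D, P, R, Q, U, W, V, x)
assert np.allclose(np.concatenate(y), T @ np.concatenate(x))
```

Because each step touches only fixed-size generator blocks, the cost grows linearly with the number of blocks, which is the source of the subquadratic matrix-vector product mentioned in the abstract; training such a layer would then differentiate through the two state recursions, which is what the title's "backpropagation through states" suggests.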