Highly Parallel Deep Ensemble Learning

16 May 2022 (modified: 05 May 2023) · NeurIPS 2022 Submission
Keywords: parallel, deep ensemble learning, spectral tensor
Abstract: In this paper, we propose a novel highly parallel deep ensemble learning method, which leads to highly compact and parallel deep neural networks. The main idea is to first represent the data in tensor form, apply a linear transform along a certain dimension, and split the transformed data into independent spectral data sets; the matrix product in conventional neural networks is then replaced by a tensor product, which in effect imposes a transform-induced structure on the original weight matrices, e.g., a block-circulant structure. The key feature of the proposed spectral tensor network is that it consists of parallel branches, each branch being an independent neural network trained on one spectral subset of the training data. Moreover, the joint data/model parallelism is amenable to GPU implementation. The outputs of the parallel branches, trained on independent spectral subsets, are combined for ensemble learning to produce an overall network with substantially stronger generalization capability than that of any individual branch. Furthermore, benefiting from the reduced input size, the proposed spectral tensor network exhibits inherent network compression and, as a result, reduced computational complexity, which accelerates the training process. The high parallelism afforded by the massive number of independent operations across the parallel spectral subnetworks enables further acceleration of both training and inference. We evaluate the proposed spectral tensor networks on the MNIST, CIFAR-10 and ImageNet data sets, and show that they simultaneously achieve network compression, reduced computation and parallel speedup.
TL;DR: A highly parallel deep ensemble neural network derived from the convolution theorem, which splits into parallel branches trained on independent spectral datasets.
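As a rough illustration of the pipeline described in the abstract (transform, split into independent spectral slices, train one branch per slice, ensemble the outputs), the following NumPy sketch uses a real FFT as the linear transform and softmax regression as a stand-in for each branch network. This is an assumption-laden toy, not the authors' implementation: the data, branch model, and ensembling rule are all simplified placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: N samples, each a length-D feature vector (stand-in for tensorized inputs).
N, D, C = 256, 8, 3                     # samples, feature dim, number of classes
X = rng.normal(size=(N, D))
y = rng.integers(0, C, size=N)

# 1) Transform along the feature dimension (the real FFT plays the role of the
#    transform from the convolution theorem) and split the transformed data
#    into independent spectral slices, one per frequency bin.
Xf = np.fft.rfft(X, axis=1)             # shape (N, D//2 + 1), complex
slices = [np.stack([Xf[:, k].real, Xf[:, k].imag], axis=1)
          for k in range(Xf.shape[1])]  # each slice: (N, 2) real features

# 2) Train one tiny independent "branch" (softmax regression) per slice.
#    The branches share nothing, so this loop is embarrassingly parallel.
def train_branch(Z, y, steps=200, lr=0.5):
    W = np.zeros((Z.shape[1], C))
    for _ in range(steps):
        logits = Z @ W
        p = np.exp(logits - logits.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        p[np.arange(len(y)), y] -= 1.0   # dL/dlogits for cross-entropy
        W -= lr * Z.T @ p / len(y)       # gradient step
    return W

branches = [train_branch(Z, y) for Z in slices]

# 3) Ensemble: average the branch logits to form the overall prediction.
logits = sum(Z @ W for Z, W in zip(slices, branches)) / len(branches)
pred = logits.argmax(axis=1)
```

Each slice here has far fewer features than the original input, which mirrors the compression and per-branch computation savings claimed in the abstract; a real instantiation would replace the softmax regression with a subnetwork and the averaging with a learned combination.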