Compact Neural Networks based on the Multiscale Entanglement Renormalization Ansatz

Anonymous

Nov 03, 2017 (modified: Nov 03, 2017) · ICLR 2018 Conference Blind Submission
  • Abstract: The goal of this paper is to demonstrate a method for tensorizing neural networks based on an efficient way of approximating scale-invariant quantum states, the Multi-scale Entanglement Renormalization Ansatz (MERA). We employ MERA as a replacement for the linear layers in a neural network (a rough sketch of such a layer is given below) and test this implementation on the CIFAR-10 dataset. The proposed method outperforms factorization using tensor trains, providing greater compression for the same level of accuracy and greater accuracy for the same level of compression. We demonstrate MERA layers with 3900 times fewer parameters and a reduction in accuracy of less than 1% compared to the equivalent fully connected layers.
  • TL;DR: We replace the fully connected layers of a neural network with the multi-scale entanglement renormalization ansatz, a type of tensor network that captures long-range correlations in quantum systems.
  • Keywords: Neural Networks, Tensor Networks, Tensor Trains
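
The following is a minimal, illustrative sketch of what a MERA-structured replacement for a fully connected layer could look like, written in PyTorch. It is not the authors' implementation: the class name `MeraBlock`, the tiny four-site geometry (input length 2^4 = 16), the bond dimension `chi`, and the dense readout are all assumptions made for illustration, and no unitarity or isometry constraints are imposed on the tensors, which are treated as free parameters.

```python
# Hedged sketch (assumed, not the authors' code) of one MERA coarse-graining
# layer: two "disentanglers" acting across site-pair boundaries, two
# "isometries" that merge pairs of sites, and a small dense readout.

import torch
import torch.nn as nn


class MeraBlock(nn.Module):
    """Stands in for a dense 16 -> out_features layer using MERA-structured
    tensors instead of one full weight matrix."""

    def __init__(self, chi: int = 4, out_features: int = 10):
        super().__init__()
        d = 2  # physical dimension of each of the four input "sites"
        # Disentanglers: 4-index tensors (d, d, d, d) on neighbouring site pairs.
        self.u1 = nn.Parameter(torch.randn(d, d, d, d) * 0.1)
        self.u2 = nn.Parameter(torch.randn(d, d, d, d) * 0.1)
        # Isometries: map two sites (d, d) into one coarse site of dimension chi.
        self.w1 = nn.Parameter(torch.randn(chi, d, d) * 0.1)
        self.w2 = nn.Parameter(torch.randn(chi, d, d) * 0.1)
        # Small dense readout on the coarse-grained representation.
        self.readout = nn.Linear(chi * chi, out_features)

    def forward(self, x):
        b = x.shape[0]
        x = x.view(b, 2, 2, 2, 2)  # reshape flat input into sites i, j, k, l
        # Disentanglers act on sites (j, k) and, with periodic wrap, (l, i).
        x = torch.einsum('bijkl,mnjk->bimnl', x, self.u1)
        x = torch.einsum('bijkl,mnli->bnjkm', x, self.u2)
        # Isometries coarse-grain (i, j) and (k, l) into single chi-dim sites.
        x = torch.einsum('bijkl,pij->bpkl', x, self.w1)
        x = torch.einsum('bpkl,qkl->bpq', x, self.w2)
        return self.readout(x.reshape(b, -1))


# Example usage: the block maps a batch of 16-dimensional inputs to 10 outputs.
layer = MeraBlock(chi=4, out_features=10)
y = layer(torch.randn(32, 16))  # y has shape (32, 10)
```

At realistic layer widths the same hierarchical pattern is repeated over many more sites, which is where the parameter savings relative to a full weight matrix arise; this toy four-site version only illustrates the connectivity.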
