An Information-Theoretic Framework for Fast and Robust Unsupervised Learning via Neural Population Infomax
Wentao Huang, Kechen Zhang
Nov 05, 2016 (modified: Feb 09, 2017) · ICLR 2017 conference submission · Readers: everyone
Abstract: A framework is presented for unsupervised learning of representations based on the infomax principle for large-scale neural populations. We use an asymptotic approximation to Shannon's mutual information for a large neural population to demonstrate that a good initial approximation to the global information-theoretic optimum can be obtained by a hierarchical infomax method. Starting from this initial solution, an efficient algorithm based on gradient descent of the final objective function is proposed to learn representations from the input datasets; the method works for complete, overcomplete, and undercomplete bases. As confirmed by numerical experiments, our method is robust and highly efficient for extracting salient features from input datasets. Compared with the main existing methods, our algorithm has a distinct advantage in both training speed and robustness of unsupervised representation learning. Furthermore, the proposed method is easily extended to supervised or unsupervised models for training deep-structure networks.
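The paper itself is not reproduced on this page, but the kind of gradient-based infomax optimization described in the abstract can be illustrated with a minimal sketch. The code below is not the authors' algorithm: it is a generic Linsker-style linear-Gaussian infomax example, in which the mutual information between a Gaussian input and a noisy linear population response is maximized by projected gradient ascent on the weight matrix. All names, dimensions, the noise model, and the unit-norm constraint are assumptions made purely for illustration.

```python
# Minimal sketch (illustrative only, not the paper's method): linear-Gaussian infomax.
# For responses r = W x + noise with x ~ N(0, C) and isotropic noise of variance sigma^2,
#   I(x; r) = 0.5 * logdet(I + W C W^T / sigma^2),
# which we maximize by projected gradient ascent with unit-norm rows of W.
import numpy as np

rng = np.random.default_rng(0)

def mutual_info(W, C, sigma2):
    """Mutual information (nats) of the linear-Gaussian population code."""
    K = W.shape[0]
    M = np.eye(K) + W @ C @ W.T / sigma2
    _, logdet = np.linalg.slogdet(M)
    return 0.5 * logdet

def grad_mutual_info(W, C, sigma2):
    """Gradient of the objective with respect to W: (1/sigma^2) M^{-1} W C."""
    K = W.shape[0]
    M = np.eye(K) + W @ C @ W.T / sigma2
    return np.linalg.solve(M, W @ C) / sigma2

# Toy input covariance (whitened image patches would replace this in practice).
d, K = 16, 8                                  # input dimension, number of units (undercomplete)
A = rng.standard_normal((d, d))
C = A @ A.T / d                               # input covariance matrix
sigma2 = 0.1                                  # output noise variance

W = rng.standard_normal((K, d))
W /= np.linalg.norm(W, axis=1, keepdims=True)

lr = 0.05
for step in range(500):
    W += lr * grad_mutual_info(W, C, sigma2)  # gradient ascent step
    W /= np.linalg.norm(W, axis=1, keepdims=True)  # project rows back to unit norm

print(f"final mutual information: {mutual_info(W, C, sigma2):.3f} nats")
```

The closed-form objective and its gradient are specific to the Gaussian assumption; the paper instead works with an asymptotic approximation to the mutual information for large neural populations, together with a hierarchical infomax initialization, so this sketch only conveys the general optimization pattern.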
TL;DR: We present a novel information-theoretic framework for fast and robust unsupervised learning via information maximization for neural population coding.
Keywords: Unsupervised Learning, Theory, Deep Learning