Bottom Up or Top Down? Dynamics of Deep Representations via Canonical Correlation Analysis
Maithra Raghu, Jason Yosinski, Jascha Sohl-Dickstein
Feb 17, 2017 (modified: Mar 05, 2017) · ICLR 2017 workshop submission · readers: everyone
Abstract: We present a versatile quantitative framework for comparing representations in deep neural networks, based on Canonical Correlation Analysis, and use it to analyze the dynamics of representation learning during the training process of a deep network. We find that layers converge to their final representations from the bottom up, but that the representations themselves migrate downward in the network over the course of learning.
TL;DR: Use CCA to study the representation learning dynamics of neural networks; we find bottom-up convergence and top-down representation crawling.
Keywords: Theory, Deep learning
Conflicts: google.com, cs.cornell.edu, uber.com
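The core tool the abstract describes can be sketched in a few lines of NumPy. This is a minimal, hedged illustration of comparing two layers' representations with CCA, not the authors' implementation: it assumes each layer's activations are collected as a matrix with one row per datapoint and one column per neuron, whitens each matrix via SVD, and reads the canonical correlations off the singular values of the product of the two whitened bases. The function name `cca_correlations` and the `eps` cutoff are choices made here for illustration.

```python
import numpy as np

def cca_correlations(X, Y, eps=1e-10):
    """Canonical correlations between two activation matrices.

    X: (n_datapoints, n_neurons_layer_a) activations of one layer.
    Y: (n_datapoints, n_neurons_layer_b) activations of another layer,
       recorded on the same n datapoints.
    Returns the canonical correlations, each in [0, 1].
    """
    # Center each view (subtract per-neuron means).
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    # Whiten each view: the left singular vectors form an orthonormal
    # basis for the span of each layer's activations.
    Ux, Sx, _ = np.linalg.svd(X, full_matrices=False)
    Uy, Sy, _ = np.linalg.svd(Y, full_matrices=False)
    # Drop near-zero singular directions for numerical stability.
    Ux = Ux[:, Sx > eps]
    Uy = Uy[:, Sy > eps]
    # Singular values of Ux^T Uy are the canonical correlations.
    rho = np.linalg.svd(Ux.T @ Uy, compute_uv=False)
    return np.clip(rho, 0.0, 1.0)
```

A convenient scalar similarity between two layers is the mean of the returned correlations; because CCA is invariant to invertible linear transformations of either view, it compares representations rather than particular neuron coordinates, which is what makes it suitable for tracking a layer against its own final state over training.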