The intriguing role of module criticality in the generalization of deep networks

25 Sep 2019 (modified: 11 Mar 2020) · ICLR 2020 Conference Blind Submission · Readers: Everyone
  • TL;DR: We study the phenomenon that some modules of DNNs are more critical than others. Our analysis leads us to propose a complexity measure that explains the superior generalization performance of some architectures over others.
  • Abstract: We study the phenomenon that some modules of deep neural networks (DNNs) are more critical than others: rewinding their parameter values back to initialization, while keeping other modules fixed at their trained values, results in a large drop in the network's performance. Our analysis reveals interesting properties of the loss landscape, which lead us to propose a complexity measure, called module criticality, based on the shape of the valleys that connect the initial and final values of the module parameters. We formulate how generalization relates to module criticality, and show that this measure explains the superior generalization performance of some architectures over others, whereas earlier measures fail to do so.
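The rewinding probe described in the abstract can be sketched in a few lines. The following is a minimal illustrative toy, not the paper's method: a two-layer linear "network" is trained on synthetic regression data, then each weight matrix (standing in for a module) is rewound to its initial value in turn, while the other stays trained, to measure the resulting loss increase. All names, the toy data, and the training setup are our own assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (stand-in for a real training set)
X = rng.normal(size=(64, 4))
y = (X @ rng.normal(size=(4,))).reshape(-1, 1)

# Two "modules": the weight matrices of a two-layer linear net
W1_init = rng.normal(size=(4, 8)) * 0.1
W2_init = rng.normal(size=(8, 1)) * 0.1
W1, W2 = W1_init.copy(), W2_init.copy()

def loss(W1, W2):
    """Mean squared error of the two-layer linear net."""
    return float(np.mean((X @ W1 @ W2 - y) ** 2))

# Train both modules with plain gradient descent
for _ in range(500):
    g = 2 * (X @ W1 @ W2 - y) / len(X)   # gradient of MSE w.r.t. predictions
    W1 -= 0.1 * (X.T @ g @ W2.T)
    W2 -= 0.1 * ((X @ W1).T @ g)

trained = loss(W1, W2)

# Rewind each module back to initialization, one at a time,
# keeping the other module at its trained values
drop_module1 = loss(W1_init, W2) - trained
drop_module2 = loss(W1, W2_init) - trained
print(f"trained={trained:.4f}  rewind W1: +{drop_module1:.4f}  rewind W2: +{drop_module2:.4f}")
```

In the paper's terminology, a module whose rewinding causes a large performance drop is "critical"; comparing `drop_module1` and `drop_module2` is the toy analogue of that comparison across modules.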
  • Keywords: Module Criticality Phenomenon, Complexity Measure, Deep Learning
