Neurons learn slower than they think

Anonymous

13 Mar 2021 (modified: 05 May 2023) · Submitted to Learning to Learn 2021
Keywords: explainable and reliable AI, optimization, convergence rate, generalization error, differential capability, item response theory
TL;DR: Neurons learn slower than they think
Abstract: Recent studies have revealed complex convergence dynamics in gradient-based methods that remain little understood. Adjusting the step size to balance a high convergence rate against a small generalization error may not be sufficient: maximizing test accuracy usually requires a larger learning rate than minimizing the training loss. To explore the dynamic bounds of the convergence rate, this study introduces differential capability into the optimization process: a measure of whether test accuracy increases as fast as the model approaches the decision boundary in a classification problem. The convergence analysis showed that: 1) a higher convergence rate leads to slower capability growth; 2) a lower convergence rate results in faster capability growth and decay; 3) regulating the convergence rate in either direction reduces differential capability.
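The abstract does not spell out how differential capability is computed, so the following is only a minimal illustrative sketch of one possible proxy: it tracks, per epoch, how fast test accuracy grows relative to how fast a linear classifier's mean training margin grows (treating margin growth as "approaching the decision boundary"). The dataset, model, and the margin-based proxy are all assumptions for illustration, not the paper's definition.

```python
# Hypothetical sketch of a "differential capability" proxy (NOT the paper's definition):
# compare the per-epoch growth of test accuracy with the per-epoch growth of the
# mean signed training margin, used here as a stand-in for boundary proximity.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import train_test_split

# Toy binary classification problem (assumed setup, purely illustrative).
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

clf = SGDClassifier(loss="log_loss", learning_rate="constant", eta0=0.01,
                    random_state=0)
classes = np.unique(y_tr)

acc_hist, margin_hist = [], []
for epoch in range(20):
    clf.partial_fit(X_tr, y_tr, classes=classes)   # one pass over the training data
    acc_hist.append(clf.score(X_te, y_te))          # test accuracy after this epoch
    # Mean signed margin of training points; larger values mean the model has
    # pushed points further from its decision boundary.
    margins = clf.decision_function(X_tr) * (2 * y_tr - 1)
    margin_hist.append(margins.mean())

# Proxy: accuracy gained per unit of margin gained, epoch by epoch.
d_acc = np.diff(acc_hist)
d_margin = np.diff(margin_hist)
capability = d_acc / np.where(np.abs(d_margin) > 1e-12, d_margin, np.nan)
print("per-epoch capability proxy:", np.round(capability, 4))
```

Under this proxy, a value near zero means the model keeps enlarging its margins without corresponding test-accuracy gains, which is one way to read the abstract's claim that a higher convergence rate can coincide with slower capability growth.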
Proposed Reviewers: Dr. Tarzan Legović (email: tlegovic@oikon.hr), Dr. Eva Cetinić (email: ecetinic@irb.hr)