- Keywords: kernels, embeddings, neural networks, spectral analysis, generalization, optimization
- TL;DR: A spectral analysis showing how kernel and neural representations can improve both optimization and generalization.
- Abstract: We extend the recent results of Arora et al. (2019) with a spectral analysis of the representations induced by kernel and neural embeddings. They showed that, in a simple single-layer network, the alignment of the labels with the eigenvectors of the corresponding Gram matrix determines both the convergence of optimization during training and the generalization properties. We generalize their result to kernel and neural representations and show that these extensions improve both optimization and generalization relative to the basic setup studied in Arora et al. (2019).
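The alignment quantity the abstract refers to can be illustrated with a short numerical sketch. This is a hypothetical example, not the paper's code: it builds a Gram matrix `H` from an RBF kernel (standing in for a kernel or neural representation), eigendecomposes it, and measures how much of the label vector's energy lies along the top eigendirections, which is the quantity Arora et al. (2019) tie to convergence speed.

```python
import numpy as np

# Sketch (with synthetic data) of the label/eigenvector alignment:
# decompose H = V diag(lambda) V^T and compute the squared projections
# (v_i^T y)^2. Labels concentrated on eigenvectors with large lambda_i
# correspond to faster gradient-descent convergence in the Arora et al. setup.

rng = np.random.default_rng(0)
n, d = 100, 5
X = rng.standard_normal((n, d))
X /= np.linalg.norm(X, axis=1, keepdims=True)  # unit-norm inputs

# RBF kernel Gram matrix as a stand-in for the representation's Gram matrix.
sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
H = np.exp(-sq_dists / 2.0)

eigvals, eigvecs = np.linalg.eigh(H)                # ascending order
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]  # sort descending

y = np.sign(X @ rng.standard_normal(d))  # synthetic +/-1 labels
proj = (eigvecs.T @ y) ** 2              # squared alignments (v_i^T y)^2

# Fraction of the label energy captured by the top-k eigendirections.
top_k = 10
energy = proj[:top_k].sum() / proj.sum()
print(f"label energy in top {top_k} eigendirections: {energy:.3f}")
```

A representation is "better" in this sense when `energy` is large for small `top_k`, i.e. the labels are well aligned with the dominant eigenvectors of `H`.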