The Optimization Landscape of Spectral Neural Networks

Published: 16 Jun 2024, Last Modified: 16 Jun 2024
Venue: HiLD at ICML 2024 Poster
License: CC BY 4.0
Keywords: Spectral Neural Networks, Graph Laplacian, Riemannian optimization
TL;DR: We show that the ambient optimization landscape of spectral neural networks is benign, and that the parameterized landscape inherits this benignness when the neural network is appropriately overparameterized.
Abstract: A large variety of machine learning methodologies are based on extracting spectral geometric information from data. However, implementations of many of these methods often rely on traditional eigensolvers, which face limitations in practical online, big-data scenarios. To address some of these challenges, researchers have proposed training neural networks as alternatives to traditional eigensolvers; one such approach is known as the Spectral Neural Network (SNN). In this paper, we initiate a theoretical exploration of the optimization landscape of SNN's objective to shed light on its training dynamics. Unlike typical studies of the convergence of NN training dynamics to global solutions, SNN presents an additional complexity due to its non-convex ambient loss function, a feature that is common in unsupervised learning settings. We show that the ambient optimization landscape is benign in a quotient geometry. Furthermore, we provide experimental evidence that the parameterized optimization landscape inherits the benignness of the ambient landscape when the neural network is appropriately overparameterized.
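To make the "ambient" landscape concrete: it is defined directly over spectral embeddings rather than over network weights. Below is a minimal NumPy sketch, assuming the ambient objective takes a common Burer-Monteiro-type form l(Y) = ||A - Y Y^T||_F^2, whose rotation invariance Y -> YO (for orthogonal O) is what motivates analyzing the landscape in a quotient geometry. The affinity matrix A, the dimensions, and the step size here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Hedged sketch of an ambient SNN-style objective l(Y) = ||A - Y Y^T||_F^2,
# where A stands in for a (normalized) graph affinity matrix and Y in R^{n x r}
# plays the role of an r-dimensional spectral embedding of n data points.
# All quantities below are illustrative assumptions, not the paper's code.

rng = np.random.default_rng(0)
n, r = 50, 3

B = rng.standard_normal((n, n))
A = (B + B.T) / 2                      # symmetric surrogate affinity matrix

Y = 0.01 * rng.standard_normal((n, r)) # ambient variable (no NN parameterization yet)

def loss(Y):
    R = A - Y @ Y.T
    return np.sum(R * R)

def grad(Y):
    # For symmetric A:  d/dY ||A - Y Y^T||_F^2 = -4 (A - Y Y^T) Y
    return -4 * (A - Y @ Y.T) @ Y

eta = 1e-3
for _ in range(2000):
    Y -= eta * grad(Y)

# Since l(Y O) = l(Y) for any orthogonal O, only the product Y Y^T is
# identifiable: this is the quotient geometry in which the landscape
# can be shown to be benign despite non-convexity in Y itself.
print(f"final loss: {loss(Y):.4f}")
```

In the parameterized setting studied experimentally, one would replace the free variable Y by the output f_theta(X) of a neural network and optimize over theta instead.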
Student Paper: Yes
Submission Number: 49