Keywords: Riemannian Manifold, Gradient Averaging, Objective Function
TL;DR: A new optimizer on Riemannian submanifolds based on gradient averaging
Abstract: In this work, we introduce $\texttt{RGrad-Avg}$, a variant of the Riemannian Gradient Descent (RGD) algorithm based on the gradient-averaging scheme $\texttt{Grad-Avg}$~\cite{Purkayastha2020AVO}, thereby extending $\texttt{Grad-Avg}$ to Riemannian submanifolds. We establish that, under reasonable assumptions, the value of the objective function decreases at each iteration of $\texttt{RGrad-Avg}$, and we validate these results on benchmark datasets, where $\texttt{RGrad-Avg}$ performs comparably to classical Riemannian Gradient Descent.
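The abstract's idea can be illustrated with a minimal sketch. The snippet below is not the authors' algorithm; it is an illustrative guess at a Riemannian gradient-averaging step on one concrete submanifold, the unit sphere, minimizing the Rayleigh quotient $f(x) = x^\top A x$. The averaging rule (average the tangent gradient at the current point and at a tentative RGD step), the step size, and the normalization retraction are all assumptions for illustration only; a faithful implementation would follow the paper and use parallel transport to combine gradients from different tangent spaces.

```python
import numpy as np

def rgrad_avg_step(x, A, lr=0.05):
    """One illustrative RGrad-Avg-style step on the unit sphere.

    Minimizes f(x) = x^T A x subject to ||x|| = 1, whose minimizer is the
    eigenvector of the smallest eigenvalue of the symmetric matrix A.
    """
    def rgrad(y):
        # Riemannian gradient: Euclidean gradient 2Ay projected onto the
        # tangent space T_y S^{n-1} = {v : y^T v = 0}.
        eg = 2.0 * A @ y
        return eg - (y @ eg) * y

    g1 = rgrad(x)
    # Tentative plain-RGD step, retracted to the sphere by normalization.
    y = x - lr * g1
    y /= np.linalg.norm(y)
    # Assumed averaging rule: mean of the two tangent gradients.
    # (Parallel transport of g2 back to T_x is omitted for simplicity.)
    g_avg = 0.5 * (g1 + rgrad(y))
    x_new = x - lr * g_avg
    return x_new / np.linalg.norm(x_new)

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
A = A + A.T  # symmetrize
x = rng.standard_normal(5)
x /= np.linalg.norm(x)
for _ in range(1000):
    x = rgrad_avg_step(x, A)
# x now approximates the eigenvector of the smallest eigenvalue of A.
```

After enough iterations the objective $x^\top A x$ approaches the smallest eigenvalue of $A$, the same fixed point classical RGD converges to on this problem, which is consistent with the abstract's claim of comparable behavior.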
Submission Number: 7