AALRSMF: An Adaptive Learning Rate Schedule for Matrix Factorization

Published: 01 Jan 2016, Last Modified: 20 Jan 2025, APWeb (2) 2016, CC BY-SA 4.0
Abstract: Stochastic gradient descent (SGD) is an effective algorithm for solving the matrix factorization problem. However, the performance of SGD depends critically on how learning rates are tuned over time. In this paper, we propose a novel per-dimension learning rate schedule called AALRSMF. This schedule relies on local gradients, requires no manual tuning of a global learning rate, and is shown to be robust to the selection of hyper-parameters. Extensive experiments demonstrate that the proposed schedule shows promising results compared to existing ones on matrix factorization.
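The abstract does not give the exact AALRSMF update rule, but the idea of a per-dimension learning rate driven by local gradients can be illustrated with a generic AdaGrad-style schedule applied to SGD matrix factorization. The sketch below is an assumption for illustration only (function name `factorize` and all hyper-parameter defaults are hypothetical), not the paper's algorithm:

```python
import numpy as np

def factorize(R, mask, k=2, epochs=200, eta=0.1, eps=1e-8, seed=0):
    """Factorize R ~ P @ Q.T via SGD, scaling each coordinate's step by
    its accumulated squared gradients (AdaGrad-style; illustrative only,
    not the AALRSMF schedule from the paper)."""
    rng = np.random.default_rng(seed)
    m, n = R.shape
    P = 0.1 * rng.standard_normal((m, k))
    Q = 0.1 * rng.standard_normal((n, k))
    # One squared-gradient accumulator per parameter dimension.
    Gp = np.zeros_like(P)
    Gq = np.zeros_like(Q)
    observed = list(zip(*np.nonzero(mask)))
    for _ in range(epochs):
        for u, i in observed:
            err = R[u, i] - P[u] @ Q[i]
            gp = -2.0 * err * Q[i]   # gradient w.r.t. row P[u]
            gq = -2.0 * err * P[u]   # gradient w.r.t. row Q[i]
            Gp[u] += gp ** 2
            Gq[i] += gq ** 2
            # Per-dimension step: local gradient history shrinks the rate,
            # so no hand-tuned global schedule is needed.
            P[u] -= eta / np.sqrt(Gp[u] + eps) * gp
            Q[i] -= eta / np.sqrt(Gq[i] + eps) * gq
    return P, Q
```

Because each coordinate keeps its own accumulator, frequently updated factors take smaller steps while rarely seen ones keep larger steps, which is one way a schedule can be robust to the initial choice of `eta`.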