Riemannian Optimization for Skip-Gram Negative Sampling

Submitted to ICLR 2017 · Readers: Everyone
Abstract: The Skip-Gram Negative Sampling (SGNS) word embedding model, well known for its implementation in the "word2vec" software, is usually optimized by stochastic gradient descent. However, optimizing the SGNS objective can be viewed as a problem of searching for a good matrix under a low-rank constraint. A standard way to solve problems of this type is to apply the Riemannian optimization framework, optimizing the SGNS objective over the manifold of matrices of the required low rank. In this paper, we propose an algorithm that optimizes the SGNS objective using Riemannian optimization and demonstrate its superiority over popular competitors, such as the original method of training SGNS and SVD over the SPPMI matrix.
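To make the low-rank formulation concrete, here is a minimal sketch of a single Riemannian gradient step over the manifold of rank-d matrices: the Euclidean gradient of the SGNS objective (in its matrix form over co-occurrence counts and expected negative-sample counts) is applied to the current iterate, and the result is retracted back to the manifold via a truncated SVD. This is an illustrative simplification, not the paper's exact algorithm (the paper uses a more refined retraction); names such as `sgns_gradient` and `riemannian_step` are assumptions for the example, and the dense matrices are only practical for toy vocabularies.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgns_gradient(X, D, neg):
    """Euclidean gradient of the SGNS objective w.r.t. the word-context matrix X.

    D   -- observed (word, context) co-occurrence counts #(w, c)
    neg -- expected negative-sample counts k * #(w) * #(c) / |D|
    Both have the same shape as X.
    """
    return D * sigmoid(-X) - neg * sigmoid(X)

def riemannian_step(U, S, Vt, D, neg, step=1e-4):
    """One simplified Riemannian ascent step on the manifold of rank-d matrices:
    take a Euclidean gradient step, then retract back to the manifold by
    truncating the SVD of the result to rank d."""
    d = S.shape[0]
    X = U @ np.diag(S) @ Vt                  # current rank-d iterate
    Y = X + step * sgns_gradient(X, D, neg)  # ascent step on the SGNS objective
    U2, S2, Vt2 = np.linalg.svd(Y, full_matrices=False)
    return U2[:, :d], S2[:d], Vt2[:d, :]     # retraction: truncate to rank d
```

In practice one would work with the low-rank factors directly rather than forming the full matrix X, but the sketch shows the structure of the iteration: gradient step followed by retraction to the low-rank manifold.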
TL;DR: We train word embeddings by optimizing the Skip-Gram Negative Sampling objective (known from word2vec) via a Riemannian low-rank optimization framework.
Conflicts: yandex-team.ru, skoltech.ru, skolkovotech.ru
Keywords: Natural language processing, Unsupervised Learning
Community Implementations: [CatalyzeX: 1 code implementation](https://www.catalyzex.com/paper/riemannian-optimization-for-skip-gram/code)
