StableMDS: A Novel Gradient Descent-Based Method for Stabilizing and Accelerating Weighted Multidimensional Scaling
Abstract: Multidimensional Scaling (MDS) is an essential technique in multivariate analysis, with Weighted MDS (WMDS) commonly employed for tasks such as dimensionality reduction and graph drawing. However, the optimization of WMDS poses significant challenges due to the highly non-convex nature of its objective function. Stress Majorization, a method classified under the Majorization-Minimization framework, is among the most widely used solvers for this problem because it guarantees non-increasing loss values during optimization, even with a non-convex objective function. Despite this advantage, Stress Majorization suffers from high computational complexity, specifically $\mathcal{O}(\max(n^3, n^2 p))$ per iteration, where $n$ denotes the number of data points and $p$ represents the projection dimension, rendering it impractical for large-scale data analysis. To mitigate this computational challenge, we introduce StableMDS, a novel gradient descent-based method that reduces the computational complexity to $\mathcal{O}(n^2 p)$ per iteration. StableMDS achieves this efficiency by applying gradient descent independently to each point, thereby eliminating the need for the costly matrix operations inherent in Stress Majorization. Furthermore, we theoretically guarantee non-increasing loss values and optimization stability akin to those of Stress Majorization. These advancements not only enhance computational efficiency but also maintain stability, thereby broadening the applicability of WMDS to larger datasets.
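For intuition about the setting, the following is a minimal NumPy sketch of the weighted stress objective and one plain per-point gradient descent step on it, which costs $\mathcal{O}(n^2 p)$ per iteration. This is illustrative only and is not the paper's StableMDS update: the learning rate, uniform weights, and function names are assumptions, and a fixed step size does not by itself provide the non-increasing-loss guarantee the paper establishes.

```python
import numpy as np

def weighted_stress(X, D, W):
    """Weighted MDS stress: sum over i<j of w_ij * (||x_i - x_j|| - d_ij)^2."""
    diff = X[:, None, :] - X[None, :, :]        # (n, n, p) pairwise differences
    dist = np.linalg.norm(diff, axis=-1)        # (n, n) embedding distances
    iu = np.triu_indices(len(X), k=1)           # use each pair once
    return np.sum(W[iu] * (dist[iu] - D[iu]) ** 2)

def gd_step(X, D, W, lr=1e-3, eps=1e-12):
    """One naive gradient descent step on the weighted stress, O(n^2 p)."""
    diff = X[:, None, :] - X[None, :, :]        # (n, n, p)
    dist = np.linalg.norm(diff, axis=-1) + eps  # avoid division by zero on the diagonal
    coeff = W * (dist - D) / dist               # (n, n) per-pair residual weights
    np.fill_diagonal(coeff, 0.0)
    grad = 2.0 * np.einsum('ij,ijk->ik', coeff, diff)  # gradient w.r.t. each point x_i
    return X - lr * grad

# Toy usage: embed 50 random 5-D points into 2-D (hypothetical example data).
rng = np.random.default_rng(0)
P = rng.normal(size=(50, 5))
D = np.linalg.norm(P[:, None] - P[None, :], axis=-1)  # target dissimilarities
W = np.ones_like(D)                                   # uniform weights for illustration
X = rng.normal(size=(50, 2))                          # random 2-D initialization
for _ in range(200):
    X = gd_step(X, D, W)
print(weighted_stress(X, D, W))
```

Each gradient evaluation touches all $n^2$ point pairs and $p$ coordinates but involves no matrix factorization or solve, which is the structural reason a per-point gradient scheme can avoid the $\mathcal{O}(n^3)$ term incurred by Stress Majorization.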
Submission Number: 664