Archetypal Analysis++: Rethinking the Initialization Strategy

TMLR Paper2279 Authors

22 Feb 2024 (modified: 22 Apr 2024) · Decision pending for TMLR
Abstract: Archetypal analysis is a matrix factorization method with convexity constraints. Due to local minima, a good initialization is essential, but frequently used initialization methods either yield sub-optimal starting points or are prone to getting stuck in poor local minima. In this paper, we propose archetypal analysis++ (AA++), a probabilistic initialization strategy for archetypal analysis that sequentially samples points based on their influence on the objective function, similar to $k$-means++. In fact, we argue that $k$-means++ already approximates the proposed initialization method. Furthermore, we suggest adapting an efficient Monte Carlo approximation of $k$-means++ to AA++. In an extensive empirical evaluation on 15 real-world data sets of varying sizes and dimensionalities, considering two pre-processing strategies, we show that AA++ almost always outperforms all baselines, including the most frequently used ones.
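The exact AA++ sampling distribution is defined in the paper itself, not on this page; as a point of reference, the $k$-means++ seeding that AA++ is modeled on (and which the abstract argues already approximates AA++) can be sketched as follows. The function name and interface here are illustrative, not from the paper.

```python
import numpy as np

def kmeanspp_init(X, k, rng=None):
    """k-means++ seeding: after a uniform first pick, each new center is
    sampled with probability proportional to its squared distance to the
    nearest center chosen so far (D^2 sampling). AA++ is described as an
    analogous scheme where points are sampled according to their influence
    on the archetypal-analysis objective instead."""
    rng = np.random.default_rng(rng)
    n = X.shape[0]
    centers = [X[rng.integers(n)]]  # first center: uniform over the data
    for _ in range(k - 1):
        C = np.asarray(centers)
        # squared distance of every point to its nearest chosen center
        d2 = np.min(((X[:, None, :] - C[None, :, :]) ** 2).sum(-1), axis=1)
        probs = d2 / d2.sum()       # D^2 sampling distribution
        centers.append(X[rng.choice(n, p=probs)])
    return np.asarray(centers)
```

Because every candidate is an actual data point, the sampled seeds are feasible starting points for methods, like archetypal analysis, whose factors are constrained to the convex hull of the data.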
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: We incorporated the feedback of the reviewers. Specifically, we
- included a new data set with more than 500 dimensions and
- added Section 6, "Discussion & Limitations", in which we
  - describe why theoretical guarantees similar to $k$-means++ are difficult to derive;
  - discuss the limitation of the initialization time;
  - explain why the total runtime of the initialization followed by 30 iterations of archetypal analysis varies; and
  - discuss why, on rare occasions, the loss slightly increases.
Code: https://github.com/smair/archetypalanalysis-initialization
Assigned Action Editor: ~Benjamin_Guedj1
Submission Number: 2279