On Robustness of the Normalized Random Block Coordinate Method for Non-Convex Optimization

CDC 2021
Abstract: Large-scale optimization problems are usually characterized not only by large numbers of data points but also by points living in a high-dimensional space. Block coordinate methods allow for efficient implementations where steps can be taken (block) coordinate-wise. Many existing algorithms rely on trustworthy gradient information and may fail to converge when such information is corrupted by possibly adversarial agents. We study the setting where the partial gradient with respect to each coordinate block is arbitrarily corrupted with some probability. We analyze the robustness properties of the normalized random block coordinate method (NRBCM) for non-convex optimization problems. We prove that NRBCM finds an $\mathcal{O}(1/\sqrt{T})$-stationary point after $T$ iterations if the corruption probabilities of the partial gradients with respect to each block are below 1/2. Under the additional assumption of gradient domination, faster rates are shown. Numerical evidence on a logistic classification problem supports our results.
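The abstract only sketches the method, so the following is a minimal, hypothetical Python illustration of one natural reading of it: at each iteration a coordinate block is drawn uniformly at random, its partial gradient is queried (and replaced by an arbitrary vector with some corruption probability), and a fixed-length step is taken along the normalized partial gradient. The update form, constant step size, corruption model, and all names (`nrbcm`, `partial_grads`, `blocks`) are assumptions made for illustration; the paper's actual algorithm and step-size schedule (e.g., one decaying with $T$ to match the stated rate) may differ.

```python
import numpy as np


def nrbcm(partial_grads, blocks, x0, T, step=0.05, corrupt_p=0.2, seed=0):
    """Hypothetical sketch of a normalized random block coordinate update
    under probabilistically corrupted partial gradients.

    partial_grads[i](x) returns the partial gradient of the objective with
    respect to the coordinates in blocks[i]. With probability corrupt_p the
    oracle instead returns an arbitrary (here: large random) vector; because
    only the direction of the partial gradient enters the update, a single
    corrupted query cannot move the iterate by more than the step size.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(T):
        i = rng.integers(len(blocks))            # pick a block uniformly at random
        g = partial_grads[i](x)                  # true partial gradient for that block
        if rng.random() < corrupt_p:             # adversarial corruption event
            g = 1e3 * rng.normal(size=g.shape)   # arbitrary corrupted direction
        norm = np.linalg.norm(g)
        if norm > 0.0:
            x[blocks[i]] -= step * g / norm      # normalized block-coordinate step
    return x


# Toy usage: minimize ||x - 1||^2 over two coordinate blocks of size 2.
target = np.ones(4)
blocks = [np.arange(0, 2), np.arange(2, 4)]
partial_grads = [lambda x, b=b: 2.0 * (x[b] - target[b]) for b in blocks]
x_hat = nrbcm(partial_grads, blocks, np.zeros(4), T=2000)
print(x_hat)  # ends up near the all-ones minimizer despite 20% corrupted queries
```

In this toy run the corruption probability (0.2) is below the 1/2 threshold stated in the abstract, which is why the normalized updates still drift toward the minimizer on average; with a constant step the iterate only settles to within roughly one step length of it.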