A Doubly Recursive Stochastic Compositional Gradient Descent Method for Federated Multi-Level Compositional Optimization

Published: 02 May 2024 · Last Modified: 25 Jun 2024 · ICML 2024 Poster · CC BY 4.0
Abstract: Federated compositional optimization has been actively studied in recent years. However, existing methods mainly target the two-level compositional optimization problem and cannot be directly applied to its multi-level counterparts. Moreover, the convergence rates of existing federated two-level compositional optimization algorithms fail to achieve linear speedup with respect to the number of workers under heterogeneous settings. After identifying the cause of this failure, we develop a novel federated stochastic multi-level compositional optimization algorithm by introducing a novel Jacobian-vector product estimator, which simultaneously mitigates the data-heterogeneity and communication-efficiency issues. We then theoretically prove that our algorithm achieves a level-independent, linear-speedup convergence rate for nonconvex problems. To our knowledge, this is the first federated learning algorithm to achieve such a favorable convergence rate for multi-level compositional problems. Experimental results further confirm the efficacy of our algorithm.
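For context, the general K-level stochastic compositional problem and its chain-rule gradient (standard in this literature, not specific to this paper) can be written as

\[ \min_{x}\; F(x) = f_K\bigl(f_{K-1}(\cdots f_1(x)\cdots)\bigr), \qquad f_i(y) = \mathbb{E}_{\xi_i}\bigl[f_i(y;\xi_i)\bigr], \]
\[ \nabla F(x) = \nabla f_1(y_0)^{\top}\,\nabla f_2(y_1)^{\top}\cdots\nabla f_{K-1}(y_{K-2})^{\top}\,\nabla f_K(y_{K-1}), \qquad y_0 = x,\; y_i = f_i(y_{i-1}), \]

where each \(\nabla f_i\) is a Jacobian. Evaluating the product from right to left requires only Jacobian-vector products, which is why a low-variance JVP estimator is central to this problem class.

The sketch below illustrates the general "doubly recursive" idea on a single worker: one STORM-style recursion tracks the inner function values, and a second tracks the Jacobian-vector-product gradient estimate. This is a toy, assumption-laden illustration (the three-level objective, all oracle functions, and the hyperparameters are invented for this example), not the paper's federated algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5

# Toy three-level objective F(x) = f3(f2(f1(x))) with noisy oracles.
# Each oracle takes an explicit noise sample z so the same "minibatch"
# can be evaluated at two consecutive iterates, as recursive
# variance-reduced estimators require. All functions are hypothetical.
def f1(y, z):  return np.tanh(y) + 0.01 * z
def J1(y, z):  return np.diag(1.0 - np.tanh(y) ** 2) + 0.01 * np.outer(z, z) / d
def f2(y, z):  return 0.5 * y ** 2 + 0.01 * z
def J2(y, z):  return np.diag(y) + 0.01 * np.outer(z, z) / d
def g3(y, z):  return 2.0 * y + 0.01 * z  # gradient of f3(y) = ||y||^2

beta, lr, T = 0.1, 0.05, 300
x = rng.standard_normal(d)

# Initialize the two recursions: inner-value estimators u1, u2 and the
# Jacobian-vector-product gradient estimator v.
z = [rng.standard_normal(d) for _ in range(5)]
u1 = f1(x, z[0])
u2 = f2(u1, z[1])
v = J1(x, z[2]).T @ (J2(u1, z[3]).T @ g3(u2, z[4]))

for t in range(T):
    x_new = x - lr * v
    z = [rng.standard_normal(d) for _ in range(5)]
    # First recursion: variance-reduced estimates of the inner values.
    u1_new = f1(x_new, z[0]) + (1 - beta) * (u1 - f1(x, z[0]))
    u2_new = f2(u1_new, z[1]) + (1 - beta) * (u2 - f2(u1, z[1]))
    # Second recursion: variance-reduced JVP estimate of the full gradient,
    # built as J1^T (J2^T grad_f3) so only matrix-vector products are needed.
    jvp_new = J1(x_new, z[2]).T @ (J2(u1_new, z[3]).T @ g3(u2_new, z[4]))
    jvp_old = J1(x, z[2]).T @ (J2(u1, z[3]).T @ g3(u2, z[4]))
    v = jvp_new + (1 - beta) * (v - jvp_old)
    x, u1, u2 = x_new, u1_new, u2_new

print("final gradient-estimate norm:", np.linalg.norm(v))
```

In the paper's federated setting, analogous estimators would presumably be maintained per worker and synchronized periodically; the single-worker sketch above omits all communication and heterogeneity handling.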
Submission Number: 7652