Globally Convergent Accelerated Algorithms for Multilinear Sparse Logistic Regression with ℓ₀-Constraints
Abstract: Multilinear logistic regression is a powerful tool for analyzing multidimensional data. To improve its efficiency and interpretability, we present a Multilinear Sparse Logistic Regression model with \(\ell_{0}\)-norm constraints (\(\ell_{0}\)-MSLR). In contrast to the \(\ell_{1}\) norm and \(\ell_{2}\) norm, the \(\ell_{0}\) norm is better suited to feature selection. However, its nonconvexity and nonsmoothness make \(\ell_{0}\)-MSLR challenging to solve, and convergence guarantees have been lacking; the multilinear operations in \(\ell_{0}\)-MSLR introduce additional nonconvexity. To tackle these issues, we propose the Accelerated Proximal Alternating Linearization Method with Adaptive Momentum (APALM\({}^{+}\)) to solve \(\ell_{0}\)-MSLR. We prove the global convergence of APALM\({}^{+}\), establish its convergence rate, and derive from it a practical convergent scheme for \(\ell_{0}\)-MSLR. To accelerate convergence, we further introduce an adaptive extrapolation strategy in APALM\({}^{+}\). Numerical experiments demonstrate the effectiveness of our method.
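To make the ingredients named in the abstract concrete, the following is a minimal, hypothetical sketch (not the paper's actual APALM\({}^{+}\) algorithm) of a proximal alternating linearized scheme with extrapolation for a rank-1 bilinear logistic model under \(\ell_{0}\) constraints. The function names, the fixed momentum parameter `beta`, and the Lipschitz-based step sizes are illustrative assumptions; the \(\ell_{0}\) constraint is handled by its exact proximal map, hard thresholding onto the sparsity level.

```python
import numpy as np

def hard_threshold(z, s):
    """Exact prox/projection for the l0 constraint ||z||_0 <= s:
    keep the s largest-magnitude entries, zero out the rest."""
    out = np.zeros_like(z)
    idx = np.argsort(np.abs(z))[-s:]
    out[idx] = z[idx]
    return out

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def apalm_sketch(X, y, s_u, s_v, iters=200, beta=0.3, seed=0):
    """Illustrative proximal alternating linearized minimization with a
    fixed extrapolation weight `beta` (the paper's adaptive momentum is
    not reproduced here). Model: logit_i = u^T X_i v, labels y in {0,1}."""
    n, p, q = X.shape
    rng = np.random.default_rng(seed)
    u, v = rng.standard_normal(p), rng.standard_normal(q)
    u_prev, v_prev = u.copy(), v.copy()
    for _ in range(iters):
        # --- u-block: extrapolate, linearize the loss, prox (hard threshold) ---
        u_bar = u + beta * (u - u_prev)
        Xv = X @ v                                   # (n, p): tensor contracted with v
        grad_u = Xv.T @ (sigmoid(Xv @ u_bar) - y) / n
        L_u = np.linalg.norm(Xv, 2) ** 2 / (4 * n) + 1e-8  # logistic Lipschitz bound
        u_prev, u = u, hard_threshold(u_bar - grad_u / L_u, s_u)
        # --- v-block: symmetric update with u fixed ---
        v_bar = v + beta * (v - v_prev)
        Xu = np.einsum('npq,p->nq', X, u)            # (n, q): contracted with u
        grad_v = Xu.T @ (sigmoid(Xu @ v_bar) - y) / n
        L_v = np.linalg.norm(Xu, 2) ** 2 / (4 * n) + 1e-8
        v_prev, v = v, hard_threshold(v_bar - grad_v / L_v, s_v)
    return u, v
```

Each block update is a single linearized (gradient) step at the extrapolated point followed by the \(\ell_{0}\) prox, so every iterate satisfies its sparsity constraint exactly; this is the structural pattern the abstract's convergence analysis concerns, not its specific step-size or momentum rules.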