Federated Smoothing ADMM for Quantile Regression with Non-Convex Sparse Penalties

Published: 01 Jan 2025 · Last Modified: 05 Aug 2025 · ICASSP 2025 · CC BY-SA 4.0
Abstract: In decentralized systems such as the Internet of Things (IoT) and cyber-physical networks, where data are distributed across multiple nodes, accurate and robust data analysis is crucial. Existing methods for penalized quantile regression often struggle with asynchronous operations and multiple updates per node, leading to inconsistencies across the distributed nodes. To address these challenges, we propose the Federated Smoothing ADMM (FSAD) algorithm, which integrates non-convex sparse penalties, specifically the minimax concave penalty (MCP) and the smoothly clipped absolute deviation (SCAD) penalty, to identify significant predictors while retaining sparsity. By incorporating a total variation norm within a smoothing ADMM framework, FSAD supports asynchronous updates and ensures model consistency across nodes, overcoming traditional convergence limitations in non-convex federated settings. Our theoretical analysis provides rigorous convergence guarantees, and extensive simulations confirm that FSAD outperforms existing methods in both accuracy and computational efficiency.
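To make the ingredients named in the abstract concrete, below is a minimal Python sketch combining a smoothed (Huberized) check loss, an MCP proximal step, and synchronous consensus ADMM across nodes. This is an illustration of the general recipe, not the authors' FSAD algorithm: the synchronous update order, the uniform-smoothing bandwidth `h`, the inner gradient steps, and all function names and parameter values (`smoothed_pinball_grad`, `mcp_prox`, `consensus_admm_qr`, `rho`, `lr`) are assumptions made for the example, and the total variation coupling from the paper is replaced here by plain consensus.

```python
# Minimal sketch: consensus ADMM for quantile regression with a smoothed
# check loss and an MCP prox. Illustrative only -- NOT the paper's FSAD
# algorithm; updates here are synchronous and all settings are assumed.
import numpy as np

def smoothed_pinball_grad(r, tau, h):
    """Gradient of the Huber-smoothed check (pinball) loss at residuals r."""
    return np.clip(r / h, tau - 1.0, tau)

def mcp_prox(x, lam, gamma, nu):
    """Proximal operator of nu * MCP(.; lam, gamma); requires gamma > nu."""
    soft = np.sign(x) * np.maximum(np.abs(x) - nu * lam, 0.0)
    inner = soft / (1.0 - nu / gamma)  # shrink-then-rescale region
    return np.where(np.abs(x) <= gamma * lam, inner, x)

def consensus_admm_qr(Xs, ys, tau=0.5, h=0.5, lam=0.1, gamma=3.0,
                      rho=1.0, n_iters=200, inner_steps=5, lr=0.1):
    K = len(Xs)                                # number of nodes
    p = Xs[0].shape[1]
    betas = [np.zeros(p) for _ in range(K)]    # local models
    us = [np.zeros(p) for _ in range(K)]       # scaled dual variables
    z = np.zeros(p)                            # global consensus variable
    for _ in range(n_iters):
        # Local updates: a few gradient steps on the augmented Lagrangian.
        for k in range(K):
            Xk, yk = Xs[k], ys[k]
            for _ in range(inner_steps):
                r = yk - Xk @ betas[k]
                g = -Xk.T @ smoothed_pinball_grad(r, tau, h) / len(yk)
                g += rho * (betas[k] - z + us[k])
                betas[k] -= lr * g
        # Global update: MCP prox applied to the averaged local iterates.
        avg = np.mean([betas[k] + us[k] for k in range(K)], axis=0)
        z = mcp_prox(avg, lam, gamma, nu=1.0 / (rho * K))
        # Dual updates keep each local model tied to the consensus z.
        for k in range(K):
            us[k] += betas[k] - z
    return z

# Toy usage: sparse ground truth, heavy-tailed noise, data split over 4 nodes.
rng = np.random.default_rng(0)
beta_true = np.zeros(20)
beta_true[:3] = [2.0, -1.5, 1.0]
Xs, ys = [], []
for _ in range(4):
    X = rng.normal(size=(100, 20))
    y = X @ beta_true + rng.standard_t(df=3, size=100)
    Xs.append(X)
    ys.append(y)
print(np.round(consensus_admm_qr(Xs, ys), 2))
```

The smoothed loss is what makes the local subproblems amenable to gradient steps (the raw check loss is non-differentiable at zero), and placing the non-convex penalty on the shared variable `z` confines the hard prox computation to one closed-form thresholding step per round. Swapping `mcp_prox` for the analogous SCAD prox changes only the global update.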