FedCFC: On-Device Personalized Federated Learning with Closed-Form Continuous-Time Neural Networks

Published: 01 Jan 2024 · Last Modified: 11 Nov 2024 · IPSN 2024 · License: CC BY-SA 4.0
Abstract: Closed-form continuous-time (CFC) neural networks offer superior expressivity for modeling time-series data compared with recurrent neural networks. CFC's lower training and inference overheads also make it appealing for microcontroller-based platforms. This paper proposes FedCFC, which advances CFC from the centralized learning setting to the federated learning paradigm. FedCFC features a novel, communication-efficient aggregation strategy that addresses class distribution skews across clients' training data. The strategy builds on a new empirical property of CFC identified in this paper: the involatility of a CFC sub-network with respect to the class distribution of the training data. Extensive evaluation on multiple time-series datasets shows that FedCFC achieves similar or higher accuracy with a 7.6× to 11× reduction in communication overhead, compared with recent federated learning approaches designed to address the class distribution skew problem. Implementations of FedCFC on four microcontroller platforms demonstrate its portability to low-end computing devices with 256 kB of memory or less.
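As context for the closed-form property the abstract relies on, below is a minimal PyTorch sketch of a CFC cell in the style of Hasani et al.'s closed-form continuous-time networks: the new hidden state is a sigmoid-gated blend of two heads, where the gate decays with the elapsed time `t` through a learned function `f`. Because the state evolves in closed form rather than through a numerical ODE solver, one forward pass per observation suffices, which is what keeps training and inference cheap on microcontrollers. All module names (`backbone`, `f`, `g`, `h`) and sizes here are illustrative assumptions, not FedCFC's exact architecture.

```python
import torch
import torch.nn as nn

class CfCCell(nn.Module):
    """Illustrative closed-form continuous-time cell (not FedCFC's exact design)."""

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        # Shared backbone over the concatenated [input, hidden state].
        self.backbone = nn.Sequential(
            nn.Linear(input_size + hidden_size, hidden_size),
            nn.Tanh(),
        )
        self.f = nn.Linear(hidden_size, hidden_size)  # learned time-decay rate
        self.g = nn.Linear(hidden_size, hidden_size)  # head dominating as t -> 0
        self.h = nn.Linear(hidden_size, hidden_size)  # head dominating as t -> inf

    def forward(self, x: torch.Tensor, hx: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        # t: elapsed time since the previous observation, shape (batch, 1).
        z = self.backbone(torch.cat([x, hx], dim=-1))
        gate = torch.sigmoid(-self.f(z) * t)  # closed-form time gating, no ODE solver
        return gate * torch.tanh(self.g(z)) + (1.0 - gate) * torch.tanh(self.h(z))

# One step on an irregularly sampled series.
cell = CfCCell(input_size=4, hidden_size=32)
x, h, dt = torch.randn(8, 4), torch.zeros(8, 32), torch.full((8, 1), 0.1)
h = cell(x, h, dt)
```

The communication saving described in the abstract comes from exchanging only the part of the network that is sensitive to class distribution, while the involatile sub-network stays local. A hypothetical sketch of such an aggregation step follows; the parameter partition (`involatile_keys`) and the plain averaging are assumptions for illustration, as the paper defines FedCFC's actual strategy.

```python
def aggregate_volatile(global_state, client_states, involatile_keys):
    """Average only the distribution-sensitive parameters across clients.

    Parameters named in `involatile_keys` are never transmitted, which is
    where the communication reduction would come from under this reading.
    """
    for name in global_state:
        if name in involatile_keys:
            continue  # involatile sub-network: kept as-is, not uploaded
        global_state[name] = torch.stack(
            [s[name] for s in client_states]
        ).mean(dim=0)
    return global_state
```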