Federated Boolean Neural Networks Learning

Published: 01 Jan 2023 · Last Modified: 15 May 2024 · FMEC 2023 · CC BY-SA 4.0
Abstract: In this paper, we propose a new centralized Federated Learning (FL) method for training Deep Neural Networks (DNNs) in resource-constrained environments. Despite its popularity, federated learning faces the increasingly difficult task of scaling communication over large wireless networks with limited bandwidth. Moreover, this distributed training paradigm requires clients to perform intensive computations over multiple iterations, which may exceed the capacity of a typical edge device with limited processing power, storage capacity, and energy budget. Practical deployment of FL therefore requires a balance between energy efficiency, driven by resource constraints, and latency, driven by bandwidth constraints. In this work, we address both constraints by integrating low-precision arithmetic on clients and exchanging only highly compressed vectors during training. Experimental results show that the proposed algorithms, FedBool and MajBool, outperform current methods on standard image classification tasks.
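To make the communication-saving idea concrete, the following is a minimal sketch of one way such a scheme can look: clients send only the sign (roughly one bit per parameter) of their local update, and the server aggregates by majority vote. This is an illustrative toy on a quadratic objective, not the paper's actual FedBool or MajBool algorithms; the functions `local_update`, `majority_aggregate`, and `make_grad_fn` are hypothetical names introduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_update(weights, grad_fn, steps=5, lr=0.1):
    """Run a few local SGD steps on a client, then return only the sign
    of the accumulated update -- roughly 1 bit per parameter."""
    w = weights.copy()
    for _ in range(steps):
        w -= lr * grad_fn(w)
    return np.sign(w - weights)  # values in {-1, 0, +1}

def majority_aggregate(sign_updates):
    """Server-side majority vote over the clients' sign vectors."""
    return np.sign(np.sum(sign_updates, axis=0))

def make_grad_fn(target):
    """Toy per-client objective: gradient of 0.5 * ||w - target||^2."""
    return lambda w: w - target

targets = [rng.normal(size=8) for _ in range(5)]  # 5 simulated clients
w_global = np.zeros(8)
server_lr = 0.05

for _ in range(50):
    updates = [local_update(w_global, make_grad_fn(t)) for t in targets]
    w_global += server_lr * majority_aggregate(updates)

print(w_global.round(2))              # converges toward the clients' consensus
print(np.mean(targets, axis=0).round(2))
```

Under these assumptions, each round transmits one sign vector per client instead of full-precision gradients, which is the kind of bandwidth reduction the abstract refers to; the low-precision client-side arithmetic described in the paper would further reduce on-device compute and storage.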