Keywords: Federated Learning; Constraint-Aware Optimization; Constrained Optimization; Lagrangian Dual Methods; On-Device Learning; Language Models
TL;DR: We present CAFL (Constraint-Aware Federated Learning), a principled approach for multi-resource optimization in federated learning that simultaneously manages energy, communication, memory, and thermal constraints through dual ascent.
Abstract: We present CAFL (Constraint-Aware Federated Learning), a principled approach for multi-resource optimization in federated learning that simultaneously manages energy, communication, memory, and thermal constraints through dual ascent. Unlike existing methods that optimize primarily for convergence, CAFL formulates federated learning as a constrained optimization problem and employs Lagrangian dual methods to dynamically adapt layer freezing, compression levels, and batch sizing. We provide theoretical convergence guarantees for our dual ascent controller and demonstrate that CAFL preserves training effectiveness through token-budget preservation while achieving significant resource savings. Experimental results on character-level language modeling demonstrate a 70.5% reduction in energy consumption, 95.3% lower communication cost, and 23% memory savings compared to FedAvg, while maintaining comparable training-loss convergence.
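The dual-ascent mechanism the abstract describes can be sketched as follows. This is an illustrative sketch, not the authors' implementation: the multiplier update rule is the standard projected dual-ascent step, and all names (the resource keys, budgets, and step size `eta`) are assumptions chosen for illustration.

```python
# Illustrative dual-ascent controller for per-round resource constraints
# (energy, communication, memory, thermal). Not the authors' code; the
# resource names, budgets, and step size eta are assumptions.

def dual_ascent_step(lmbda, usage, budget, eta=0.1):
    """One projected dual-ascent update per resource:
    lambda_k <- max(0, lambda_k + eta * (usage_k - budget_k)).
    Constraint violations (usage > budget) raise the multiplier,
    which would in turn push the primal controller toward cheaper
    actions (more layer freezing, higher compression, smaller batches)."""
    return {k: max(0.0, lmbda[k] + eta * (usage[k] - budget[k]))
            for k in lmbda}

# Example: energy is over budget, so its multiplier rises;
# memory is under budget, so its multiplier decays toward zero.
lmbda = {"energy": 0.5, "comm": 0.2, "memory": 0.1, "thermal": 0.0}
usage = {"energy": 1.2, "comm": 0.8, "memory": 0.3, "thermal": 0.9}
budget = {"energy": 1.0, "comm": 1.0, "memory": 0.5, "thermal": 1.0}
lmbda = dual_ascent_step(lmbda, usage, budget)
```

The `max(0, ...)` projection keeps each multiplier nonnegative, so a constraint that is comfortably satisfied eventually stops influencing the primal decisions.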
Primary Area: optimization
Submission Number: 24519