Highlights

• Address the key challenges of fully asynchronous FL: trip asynchrony, local update drift, and dynamic communication.
• Propose a 1-bit feedback mechanism to dynamically regulate client trips and match their capabilities.
• Present a sharpness-aware adversarial local update approach for resilient local model training (a sketch follows the list).
• Introduce a lightweight communication-aware dropout strategy for adaptive gradient compression.
• Propose momentum-based asynchronous federated optimization using accumulated gradients for optimal updates.
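To make the sharpness-aware adversarial local update concrete, below is a minimal sketch of a SAM-style two-pass client step, assuming the approach follows the standard sharpness-aware minimization recipe (perturb the weights along the ascent direction, then descend using the gradient at the perturbed point). The function name, `rho`, and `lr` are illustrative assumptions, not the paper's actual interface.

```python
import torch

def sharpness_aware_local_step(model, loss_fn, batch, lr=0.01, rho=0.05):
    """One SAM-style local update (illustrative sketch, not the paper's code):
    perturb weights toward the locally sharpest ascent direction, then take
    the descent step using the gradient computed at the perturbed point."""
    x, y = batch

    # First pass: gradient at the current weights.
    loss = loss_fn(model(x), y)
    model.zero_grad()
    loss.backward()
    grads = [p.grad.detach().clone() for p in model.parameters()]
    grad_norm = torch.sqrt(sum((g ** 2).sum() for g in grads)) + 1e-12

    # Adversarial perturbation: epsilon = rho * g / ||g||.
    with torch.no_grad():
        for p, g in zip(model.parameters(), grads):
            p.add_(rho * g / grad_norm)

    # Second pass: gradient at the perturbed weights.
    loss_perturbed = loss_fn(model(x), y)
    model.zero_grad()
    loss_perturbed.backward()

    # Undo the perturbation, then descend with the sharpness-aware gradient.
    with torch.no_grad():
        for p, g in zip(model.parameters(), grads):
            p.sub_(rho * g / grad_norm)
        for p in model.parameters():
            p.sub_(lr * p.grad)
    return loss.item()
```

Seeking flat minima this way is a common route to local models that stay resilient under client drift; how the paper combines it with the 1-bit feedback and dropout-based compression is detailed in the body of the article.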