Abstract: In real-world Industrial Internet of Things (IIoT) scenarios, the limited storage capacity of IIoT devices means that fresh data continuously received by diverse devices overwrites outdated data and changes the local data distribution. However, recent studies have demonstrated that federated learning tends to focus on training with fresh data, so the latest global model may forget the historical update directions (i.e., catastrophic forgetting). This issue can significantly degrade global model accuracy. Existing methods primarily focus on integrating outdated data characteristics into fresh data but overlook the large parameter update gap between global and local models during global aggregation. This gap can cause the global model updates to deviate from the optimal direction. To this end, we propose a federated adaptive weighted aggregation method based on model consistency (FedAWAC). Specifically, FedAWAC measures model consistency on devices and dynamically adjusts the aggregation weight of each local model, thereby guiding the global model toward optimal updates. Furthermore, FedAWAC integrates, on the cloud server, the $\mathcal{M}$ historical global models most correlated with the latest global model to overcome catastrophic forgetting. Experiments on four different datasets under non-identically and independently distributed (non-IID) settings indicate that, compared to five baselines, FedAWAC improves global model accuracy by an average of 1.86%, reduces the forgetting rate by an average of 3.91%, and saves up to 2.57 GB of memory on average.
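To make the aggregation idea concrete, the following is a minimal sketch of consistency-weighted aggregation. It is not the paper's exact algorithm: the function name, the use of cosine similarity against the mean update direction as the "consistency" measure, and the softmax `temperature` parameter are all illustrative assumptions; FedAWAC's actual consistency metric and weighting rule are defined in the paper body.

```python
import numpy as np

def consistency_weighted_aggregate(global_model, local_models, temperature=1.0):
    """Illustrative sketch (not the paper's exact rule): aggregate local
    models with weights derived from how consistent each client's update
    direction is with the mean update direction."""
    # Per-client update directions relative to the current global model.
    updates = [m - global_model for m in local_models]
    mean_update = np.mean(updates, axis=0)

    def cos(a, b):
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return float(a @ b / denom) if denom > 0 else 0.0

    # Consistency score per client: cosine similarity to the mean direction.
    scores = np.array([cos(u, mean_update) for u in updates])
    # Softmax turns scores into positive aggregation weights summing to 1.
    weights = np.exp(scores / temperature)
    weights /= weights.sum()

    # Weighted combination of updates applied to the global model.
    new_global = global_model + sum(w * u for w, u in zip(weights, updates))
    return new_global, weights
```

Under this assumed scheme, a client whose update points against the majority direction receives a smaller weight, which is one plausible way to keep global updates from deviating toward inconsistent local models.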