Abstract: Quantum federated learning (QFL) combines the principles of quantum neural networks (QNNs) and federated learning (FL), preserving privacy by sharing only quantum parameter updates with the server. However, server aggregation in conventional QFL incorporates the quantum parameters of all local models, so outlier heterogeneous local models degrade overall performance. Motivated by this, a novel QFL algorithm is proposed that enhances learning performance by excluding outliers before aggregation through Lyapunov optimization at the QFL server. The Lyapunov optimization manages the tradeoff between accuracy and latency, where accuracy is associated with the freshness of aggregation. In addition, an entropy-and-fidelity-aware algorithm is proposed that relies on degree control to select outliers for exclusion comprehensively. This algorithm addresses both data and quantum aspects: it evaluates the class imbalance of each local model's data using entropy and quantifies the dissimilarity in quantum states between the target model and each local model using fidelity. Experimental results demonstrate that the proposed algorithm outperforms benchmarks, effectively excluding heterogeneous local models to improve performance while ensuring stability.
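For reference, the entropy and fidelity measures mentioned in the abstract are presumably the standard Shannon entropy of each client's class distribution and the standard quantum state fidelity; a minimal sketch under that assumption (the symbols p_k, rho, and sigma_k are illustrative, not taken from the paper) is

H(p_k) = -\sum_{c=1}^{C} p_{k,c} \log p_{k,c}, \qquad F(\rho, \sigma_k) = \left( \operatorname{Tr}\sqrt{\sqrt{\rho}\,\sigma_k\sqrt{\rho}} \right)^{2},

where a low H(p_k) would flag client k as having an imbalanced local dataset, and a low F(\rho, \sigma_k) would flag its quantum state \sigma_k as dissimilar from the target model's state \rho, making it a candidate for exclusion before aggregation.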