Abstract: Federated learning is an emerging machine learning framework widely used for distributed training because of its defining characteristic that "the data stays put while the model moves". In practical scenarios, the data samples and hardware conditions of clients are highly heterogeneous, so naive aggregation can cause the global model to unintentionally favor certain clients, leaving a significant performance gap between vulnerable groups and groups with richer training resources. This paper proposes Dynamic Momentum-based Federated Learning (DMFL) to address this issue: in each round, it dynamically adjusts each client's aggregation weight based on its historical performance and current-round loss. Experimental results show that DMFL improves overall model accuracy while reducing the variance of the per-client accuracy distribution, achieving better fairness than existing baselines.
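
To illustrate the aggregation idea described above, here is a minimal sketch of one server-side round. It assumes the weighting is proportional to a momentum-smoothed per-client loss (so that clients performing worse receive more weight); the function name dmfl_aggregate, the momentum coefficient beta, and the exact normalization are illustrative assumptions, not the paper's precise specification.

```python
import numpy as np

def dmfl_aggregate(client_models, client_losses, momentum, beta=0.9):
    """One round of momentum-weighted aggregation (hypothetical sketch).

    client_models: list of 1-D parameter vectors, one per client
    client_losses: current-round training losses, one per client
    momentum:      running per-client loss estimates (updated in place)
    beta:          assumed momentum coefficient blending history and present
    """
    losses = np.asarray(client_losses, dtype=float)
    # Blend historical performance with the current round's losses.
    momentum[:] = beta * momentum + (1.0 - beta) * losses
    # Clients with a higher smoothed loss get larger weights, so the
    # global model pays more attention to under-served clients.
    weights = momentum / momentum.sum()
    # The weighted average of client parameters forms the new global model.
    return np.average(np.stack(client_models), axis=0, weights=weights)

# Toy usage: three clients with 10-dimensional parameter vectors.
momentum = np.ones(3)  # initialize running losses uniformly
global_model = dmfl_aggregate(
    [np.random.randn(10) for _ in range(3)],
    client_losses=[0.8, 1.5, 0.4],
    momentum=momentum,
)
```

Under these assumptions, the momentum term prevents a single noisy round from dominating the weights, while the loss-proportional weighting narrows the accuracy gap across clients.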