Abstract: Federated machine unlearning is an emerging Machine-Learning-as-a-Service (MLaaS) paradigm that supports requests to remove, or forget, the influence of a specific group of data from federated learning services. While current federated unlearning methods work well at removing instances on individual clients, they struggle to address multiple forms of unlearning requirements, such as heterogeneous learning models and diverse unlearning contents. In this paper, we propose novel federated few-shot learning and unlearning models inspired by knowledge distillation. First, we propose a federated metric learning model in which only prototypes are exchanged between the server and clients, each holding limited samples. Local training data with different input dimensions are transformed into abstract prototypes of the same length, enabling collaborative training of heterogeneous models across clients. We then propose an efficient federated metric unlearning method in which the temporarily stored prototypes serve as teacher knowledge that guides and accelerates retraining in each unlearning scenario. We analyze the complexity of the federated metric unlearning method in terms of computation time and communication cost. Experimental results demonstrate that our approach outperforms baseline methods in accuracy while effectively removing the requested unlearning contents.
DOI: 10.1109/TSC.2025.3645435
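To make the prototype exchange concrete, the following is a minimal sketch of prototype-based federated metric learning under stated assumptions: clients with different input dimensions each map their few-shot data into a shared embedding space and upload only per-class mean embeddings (prototypes), which the server averages. All names here (`ClientEncoder`, `EMBED_DIM`, `local_prototypes`, `server_aggregate`) and the 64-dimensional embedding are hypothetical illustrations, not the paper's actual architecture.

```python
# Sketch: heterogeneous clients share fixed-length prototypes, never raw data.
import torch
import torch.nn as nn

EMBED_DIM = 64  # assumed shared prototype length across heterogeneous clients

class ClientEncoder(nn.Module):
    """Maps a client-specific input dimension into the shared embedding space."""
    def __init__(self, input_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(input_dim, 128), nn.ReLU(), nn.Linear(128, EMBED_DIM)
        )

    def forward(self, x):
        return self.net(x)

def local_prototypes(encoder, x, y, num_classes):
    """Compute per-class mean embeddings (prototypes) from a few local shots."""
    z = encoder(x)
    protos = torch.zeros(num_classes, EMBED_DIM)
    for c in range(num_classes):
        mask = y == c
        if mask.any():
            protos[c] = z[mask].mean(dim=0)
    # Detach before upload: only the abstract prototypes leave the client.
    return protos.detach()

def server_aggregate(client_protos):
    """Average the class prototypes uploaded by all clients (FedAvg-style)."""
    return torch.stack(client_protos).mean(dim=0)

# Example: two clients with different input dimensionalities, same prototype shape.
enc_a, enc_b = ClientEncoder(input_dim=20), ClientEncoder(input_dim=35)
xa, ya = torch.randn(10, 20), torch.randint(0, 3, (10,))
xb, yb = torch.randn(8, 35), torch.randint(0, 3, (8,))
global_protos = server_aggregate([
    local_prototypes(enc_a, xa, ya, num_classes=3),
    local_prototypes(enc_b, xb, yb, num_classes=3),
])
print(global_protos.shape)  # torch.Size([3, 64])
```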
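The unlearning step can likewise be sketched as distillation-guided retraining: prototypes cached before the unlearning request act as the teacher, rows for forgotten classes are simply never used, and retained-class embeddings are pulled back toward the cached prototypes so retraining converges faster than from scratch. The specific loss form below (cross-entropy over distances to retained prototypes plus an MSE distillation term) is an assumption for illustration, not the paper's exact objective; `unlearning_loss` and its parameters are hypothetical.

```python
# Sketch: retraining on retained data, guided by cached teacher prototypes.
import torch
import torch.nn.functional as F

def unlearning_loss(encoder, x_retain, y_retain, teacher_protos,
                    retain_classes, alpha=0.5):
    """Loss over retained data; forgotten classes' prototype rows are dropped.

    teacher_protos: [num_classes, embed_dim] prototypes stored before unlearning.
    retain_classes: 1-D tensor of class indices that should be kept.
    """
    z = encoder(x_retain)                    # [n, embed_dim]
    protos = teacher_protos[retain_classes]  # keep only retained-class rows
    # Metric classification: logits are negative distances to retained prototypes.
    logits = -torch.cdist(z, protos)         # [n, num_retained]
    # Remap original labels to indices within the retained-class list.
    remap = {int(c): i for i, c in enumerate(retain_classes)}
    y = torch.tensor([remap[int(c)] for c in y_retain])
    ce = F.cross_entropy(logits, y)
    # Distillation: pull new per-class means toward the cached teacher prototypes,
    # which accelerates retraining relative to starting from scratch.
    kd = 0.0
    for i, c in enumerate(retain_classes):
        mask = y_retain == c
        if mask.any():
            kd = kd + F.mse_loss(z[mask].mean(dim=0), protos[i])
    return ce + alpha * kd

# Usage with any encoder mapping inputs into the shared embedding space:
enc = torch.nn.Sequential(torch.nn.Linear(20, 64))
opt = torch.optim.SGD(enc.parameters(), lr=0.1)
x, y = torch.randn(16, 20), torch.randint(0, 3, (16,))
teacher = torch.randn(4, 64)        # prototypes cached before the request
retain = torch.tensor([0, 1, 2])    # class 3 is being forgotten
loss = unlearning_loss(enc, x, y, teacher, retain)
loss.backward(); opt.step()
```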