FedQUIT: On-Device Federated Unlearning via a Quasi-Competent Virtual Teacher

16 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: federated unlearning, machine unlearning, federated learning
TL;DR: FedQUIT is an efficient on-device federated unlearning method that penalizes the true-class score via self-distillation on the forget data.
Abstract: Federated Learning (FL) enables the collaborative training of machine learning models without requiring centralized collection of user data. To comply with the right to be forgotten, FL clients should be able to request the removal of their data contributions from the global model. In this paper, we propose FedQUIT, a novel unlearning algorithm that operates directly on the device of a client requesting the removal of its contribution. Our method leverages knowledge distillation to remove the influence of the target client’s data from the global model while preserving its generalization ability. FedQUIT adopts a teacher–student framework, where a modified version of the current global model serves as a virtual teacher and the local model acts as the student. The virtual teacher is constructed by adjusting the global model’s outputs on forget data, penalizing the confidence assigned to the true class while preserving the relationships among the non-true classes’ outputs, so as to simultaneously induce forgetting and retain useful knowledge. As a result, FedQUIT achieves unlearning without any additional assumptions beyond the standard FedAvg protocol. Evaluation across diverse datasets, data heterogeneity levels, and model architectures shows that FedQUIT achieves superior unlearning compared to six state-of-the-art methods, while significantly reducing cumulative communication and computational overhead relative to retraining from scratch.
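The abstract does not spell out the exact penalization rule, so the following is a minimal PyTorch-style sketch of one plausible reading: on forget data, the true-class logit of the global model is suppressed to the per-sample minimum logit (leaving the ordering of non-true classes intact), and the local student is then trained against this virtual teacher with standard knowledge distillation. The function names, the suppression-to-minimum rule, and the temperature T are illustrative assumptions, not FedQUIT's specification.

```python
import torch
import torch.nn.functional as F

def virtual_teacher_logits(global_logits: torch.Tensor,
                           true_labels: torch.Tensor) -> torch.Tensor:
    """Hypothetical virtual-teacher construction (an assumed reading of the
    abstract, not the paper's exact rule): replace each sample's true-class
    logit with that sample's minimum logit, so the teacher assigns low
    confidence to the true class while the relative ordering of the
    non-true classes is left untouched."""
    penalized = global_logits.clone()
    min_logit = global_logits.min(dim=1, keepdim=True).values
    penalized.scatter_(1, true_labels.unsqueeze(1), min_logit)
    return penalized

def unlearning_kd_loss(student_logits: torch.Tensor,
                       global_logits: torch.Tensor,
                       true_labels: torch.Tensor,
                       T: float = 2.0) -> torch.Tensor:
    """Standard KD objective (KL divergence at temperature T) against the
    virtual teacher; T = 2.0 is an illustrative choice, not taken from
    the paper."""
    teacher = virtual_teacher_logits(global_logits, true_labels)
    return F.kl_div(F.log_softmax(student_logits / T, dim=1),
                    F.softmax(teacher / T, dim=1),
                    reduction="batchmean") * (T * T)
```

Under this reading, the client would minimize unlearning_kd_loss over its forget data only, with the (frozen) global model producing global_logits and the local copy producing student_logits, which keeps the procedure compatible with plain FedAvg rounds.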
Primary Area: other topics in machine learning (i.e., none of the above)
Submission Number: 7563