Machine unlearning is a challenging task in the federated learning (FL) ecosystem due to its decentralized nature. Many existing approaches rely on retraining the model from scratch, which is computationally inefficient. We propose a novel federated unlearning method that addresses this inefficiency by partitioning the global and local model parameter spaces into subspaces. During federated training, we cluster the parameters contributed by all clients and map each cluster to corresponding neurons in the global and local models. When an unlearning request is made, the neurons specific to the target class are frozen, effectively neutralizing that class's contribution. Evaluated on the MNIST and CIFAR-10 datasets, the method achieves complete unlearning for targeted classes, with accuracy dropping to 0.00% for "Airplane" in CIFAR-10 and digit 9 in MNIST, while preserving baseline performance for the remaining classes, such as 98.50% for digit 1 and 69.50% for "Ship". On average, the method retains 95.2% accuracy for unaffected classes.
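The abstract does not include an implementation, so the sketch below is only a minimal single-machine illustration of the two steps it describes: building a map from classes to the hidden neurons they dominate (a stand-in for the paper's parameter-space clustering, shown here with k-means over per-class activation statistics), and freezing those neurons on an unlearning request by zeroing their weights and masking their gradients. All names (`build_class_neuron_map`, `freeze_class_neurons`, `act_stats`) are hypothetical, and the simple MLP is an assumption, not the paper's architecture.

```python
import numpy as np
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

class MLP(nn.Module):
    """Toy stand-in for the global/local model (assumed, not from the paper)."""
    def __init__(self, in_dim=784, hidden=256, n_classes=10):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, hidden)
        self.fc2 = nn.Linear(hidden, n_classes)

    def forward(self, x):
        return self.fc2(torch.relu(self.fc1(x)))

def build_class_neuron_map(act_stats: np.ndarray, n_classes: int) -> dict:
    """act_stats: (hidden, n_classes) mean activation of each hidden neuron per
    class, collected during federated training. Cluster neurons and assign each
    cluster to the class its centroid responds to most strongly."""
    km = KMeans(n_clusters=n_classes, n_init=10).fit(act_stats)
    mapping = {c: [] for c in range(n_classes)}
    for neuron, cluster in enumerate(km.labels_):
        cls = int(np.argmax(km.cluster_centers_[cluster]))
        mapping[cls].append(neuron)
    return mapping

def freeze_class_neurons(model: MLP, class_neuron_map: dict, target_class: int):
    """Neutralize the neurons attributed to target_class: zero their incoming
    and outgoing weights, then mask their gradients so continued federated
    rounds leave them inert."""
    idx = torch.tensor(class_neuron_map[target_class], dtype=torch.long)
    with torch.no_grad():
        model.fc1.weight[idx, :] = 0.0   # incoming weights of frozen neurons
        model.fc1.bias[idx] = 0.0
        model.fc2.weight[:, idx] = 0.0   # outgoing weights into the classifier

    def zero_rows(grad, idx=idx):
        g = grad.clone()
        g[idx] = 0.0
        return g

    def zero_cols(grad, idx=idx):
        g = grad.clone()
        g[:, idx] = 0.0
        return g

    model.fc1.weight.register_hook(zero_rows)
    model.fc1.bias.register_hook(zero_rows)
    model.fc2.weight.register_hook(zero_cols)

# Hypothetical usage: unlearn CIFAR-10 class 0 ("Airplane") after training.
# class_neuron_map = build_class_neuron_map(act_stats, n_classes=10)
# freeze_class_neurons(model, class_neuron_map, target_class=0)
```

In this reading, the gradient hooks are what make freezing durable: zeroing the weights alone would unlearn the class only until the next round of federated updates re-trained those neurons.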