Noisy Scrubber: Unlearning Using Noisy Representations

ICLR 2026 Conference Submission 21198 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: machine unlearning, exact unlearning, approximate unlearning, unlearning with representation
TL;DR: We introduce Noisy Scrubber, a machine unlearning method that addresses unlearning scalability by injecting targeted noise into latent representations and remains effective with only a limited subset of the retain data.
Abstract: Machine Unlearning (MU) aims to remove the influence of specific data points from trained models, with applications ranging from privacy enforcement to debiasing and mitigating data poisoning. Although exact unlearning ensures complete data removal via retraining, this process is computationally intensive, motivating the development of efficient approximate unlearning methods. Existing approaches typically modify model parameters, which limits scalability, introduces instability, and requires extensive tuning. We propose Noisy Scrubber, a novel MU framework that learns to inject perturbations into the latent representations rather than modifying model parameters. To show that Noisy Scrubber attains approximate unlearning, we theoretically establish bounds on the parameter gap between the original and the exactly unlearned model, as well as on the output discrepancy between Noisy Scrubber and exact unlearning. Empirical results on CIFAR-10, CIFAR-100, and AGNews demonstrate that Noisy Scrubber closely matches exact unlearning while being significantly more efficient, reducing unlearning gaps to 0.024, 0.129, and 0.006, respectively. Moreover, membership inference evaluations confirm that Noisy Scrubber removes information comparably to retraining. Our approach scales across model families in both vision and text, and introduces a flexible, attachable noise module that enables on-demand and reversible unlearning.
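The core idea of perturbing latent representations while leaving model parameters frozen can be sketched as follows. This is a minimal illustrative sketch in NumPy under my own assumptions, not the paper's actual architecture: the names `encoder`, `scrubber`, and the parameters `W`, `A`, `b` are all hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

def encoder(x, W):
    # Stand-in for a frozen backbone: its weights W are never modified.
    return x @ W

def scrubber(z, A, b):
    # Attachable noise module: adds a learned perturbation to the
    # latent representation z instead of touching model parameters.
    return z + np.tanh(z @ A + b)

W = rng.normal(size=(8, 16))          # frozen encoder weights
A = rng.normal(size=(16, 16)) * 0.1   # learned perturbation weights (assumed)
b = np.zeros(16)

x = rng.normal(size=(2, 8))
z = encoder(x, W)                     # original latent representation
z_scrubbed = scrubber(z, A, b)        # perturbed ("unlearned") representation
# Detaching the module recovers z exactly, which is what makes the
# unlearning on-demand and reversible in this sketch.
```

Because the backbone is untouched, "un-unlearning" here amounts to bypassing the module, consistent with the abstract's claim of reversible, on-demand unlearning.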
Supplementary Material: zip
Primary Area: alignment, fairness, safety, privacy, and societal considerations
Submission Number: 21198