Regularized $f$-Divergence Kernel Tests

Published: 03 Feb 2026, Last Modified: 03 Feb 2026. AISTATS 2026 Poster. License: CC BY 4.0
TL;DR: We propose a framework to construct kernel-based two-sample tests from any f-divergence, enabling the detection of diverse distributional differences.
Abstract: We propose a framework to construct practical kernel-based two-sample tests from the family of $f$-divergences. The test statistic is computed from the witness function of a regularized variational representation of the divergence, which we estimate using kernel methods. Aggregation is used to adapt the test over hyperparameters such as the kernel bandwidth and the regularization parameter. While our test covers a variety of $f$-divergences, we focus in particular on the hockey-stick divergence, motivated by its applications to differential privacy auditing and machine unlearning evaluation. We provide theoretical guarantees on statistical test power across our family of $f$-divergence estimates. For two-sample testing, experiments demonstrate that different $f$-divergences are sensitive to different localized differences, illustrating the importance of leveraging diverse statistics. For machine unlearning, we propose a relative test that distinguishes true unlearning failures from safe distributional variations.
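To make the recipe in the abstract concrete, the following is a minimal illustrative sketch (not the paper's estimator) of a witness-based statistic for one specific $f$-divergence, the chi-squared divergence, whose variational form is $\sup_g 2\,\mathbb{E}_P[g] - \mathbb{E}_Q[g^2] - 1$. The witness $g$ is restricted to a span of kernel functions at the pooled sample and Tikhonov-regularized in the RKHS norm, so the optimum solves a linear system. All function names, the kernel choice, and the hyperparameter values here are assumptions for illustration.

```python
import numpy as np

def gaussian_kernel(X, Z, bandwidth):
    """RBF kernel matrix k(x, z) = exp(-||x - z||^2 / (2 * bandwidth^2))."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def regularized_chi2_statistic(X, Y, bandwidth=1.0, lam=1e-3):
    """Plug-in estimate of the chi-squared divergence D(P || Q) from its
    variational form sup_g 2 E_P[g] - E_Q[g^2] - 1, with the witness g
    restricted to span{k(z_i, .)} over the pooled sample and penalized
    by lam * ||g||_H^2 (Tikhonov regularization)."""
    Z = np.vstack([X, Y])                    # basis points for the witness
    Kzz = gaussian_kernel(Z, Z, bandwidth)   # Gram matrix for the RKHS norm
    Kxz = gaussian_kernel(X, Z, bandwidth)   # witness features on X ~ P
    Kyz = gaussian_kernel(Y, Z, bandwidth)   # witness features on Y ~ Q
    n, m, N = len(X), len(Y), len(Z)
    # Stationarity of the concave regularized objective in the coefficients:
    #   [(1/m) Kyz^T Kyz + lam * Kzz] alpha = (1/n) Kxz^T 1
    A = Kyz.T @ Kyz / m + lam * Kzz + 1e-10 * np.eye(N)  # jitter for stability
    alpha = np.linalg.solve(A, Kxz.sum(axis=0) / n)
    g_X, g_Y = Kxz @ alpha, Kyz @ alpha      # witness evaluated on each sample
    # Plug the fitted witness back into the variational objective; the
    # regularization introduces a small downward bias.
    return 2.0 * g_X.mean() - (g_Y ** 2).mean() - 1.0

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(200, 1))      # sample from P = N(0, 1)
Y_alt = rng.normal(1.0, 1.0, size=(200, 1))  # Q shifted: N(1, 1)
Y_null = rng.normal(0.0, 1.0, size=(200, 1)) # Q equal in distribution to P

stat_alt = regularized_chi2_statistic(X, Y_alt)
stat_null = regularized_chi2_statistic(X, Y_null)
print(f"shifted pair: {stat_alt:.3f}, null pair: {stat_null:.3f}")
```

In the paper's full framework this single statistic would additionally be aggregated over bandwidths and regularization parameters, and calibrated (e.g. by permutation) to obtain a valid test; the sketch above only shows that the witness-based statistic separates a shifted pair from a null pair.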
Submission Number: 1849