Keywords: covariate shift, covariance, divergence, generative adversarial network, witness function
TL;DR: A method to find the slices of the feature space where differences between the two distributions are most evident.
Abstract: We investigate an interpretable approach to compare two distributions. The approach, the max-sliced Bures divergence, approximates the max-sliced Wasserstein distance and projects the distributions onto a one-dimensional subspace defined by a 'slicing' vector. Unlike heuristic algorithms for the max-sliced Wasserstein-2 distance, which are not guaranteed to find the optimal slice, we detail a tractable algorithm that finds the globally optimal slice and scales to large sample sizes, because the objective is expressed in terms of second moments. However, relying only on second moments, it cannot detect changes in higher-order statistics. To overcome this, we explore using a non-linear mapping provided by the internal representation of a pre-trained neural network (Inception Net). Our approach provides an interpretation of the Fréchet Inception distance by identifying the instances that are overrepresented or underrepresented in one sample with respect to the other. We apply the proposed measure to detect class imbalances and underrepresentation within data sets.
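The sketch below is illustrative and not the authors' algorithm: it assumes the slice-wise objective takes the one-dimensional Fréchet form (w·μ1 − w·μ2)² + (√(wᵀΣ1w) − √(wᵀΣ2w))², which depends only on the samples' means and covariances, and it approximates the maximizing slice by random search over unit directions as a stand-in for the tractable, globally optimal solver described in the abstract. The function names and parameters are hypothetical.

```python
import numpy as np

def sliced_objective(w, mu1, mu2, cov1, cov2):
    """1-D Frechet-type gap between the two distributions projected onto unit vector w."""
    mean_gap = (w @ (mu1 - mu2)) ** 2
    std_gap = (np.sqrt(w @ cov1 @ w) - np.sqrt(w @ cov2 @ w)) ** 2
    return mean_gap + std_gap

def approx_max_slice(x, y, n_candidates=2000, seed=0):
    """Approximate the most discriminative slice between samples x and y.

    x, y: (n, d) and (m, d) feature matrices (e.g., Inception activations).
    Returns the best candidate unit vector and its objective value.
    """
    mu1, mu2 = x.mean(axis=0), y.mean(axis=0)
    cov1 = np.cov(x, rowvar=False)
    cov2 = np.cov(y, rowvar=False)
    # Random unit directions as candidate slices (stand-in for the exact solver).
    rng = np.random.default_rng(seed)
    cand = rng.standard_normal((n_candidates, x.shape[1]))
    cand /= np.linalg.norm(cand, axis=1, keepdims=True)
    scores = np.array([sliced_objective(w, mu1, mu2, cov1, cov2) for w in cand])
    best = int(np.argmax(scores))
    return cand[best], scores[best]

# Usage sketch: project the samples onto the slice and inspect the extremes;
# instances at the tails are over- or under-represented relative to the other sample.
# w, score = approx_max_slice(features_p, features_q)
# ranking = features_p @ w
```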