Keywords: statistical distance, probabilistic inference, reductions
Abstract: Statistical distance (also known as total variation distance) and probabilistic inference are fundamental notions, widely used in machine learning, information theory, and high-dimensional statistics.
While there are efficient algorithms that can estimate statistical distance or perform probabilistic inference in some specific settings, it has remained an open problem whether these two notions can be approximately reduced to each other.
In this work, we take a first step toward addressing this problem and show that estimating statistical distance can be reduced to estimating probabilistic inference, via an efficient, structure-preserving randomized reduction.
This allows us to use approximate inference algorithms to multiplicatively estimate statistical distance in directed graphical models.
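As a point of reference for the quantity being estimated, the following is a minimal sketch (not from the paper) of statistical distance, i.e. total variation distance, between two discrete distributions given as probability dictionaries; the distribution values are illustrative:

```python
def tv_distance(p, q):
    """Total variation distance: 0.5 * sum_x |p(x) - q(x)| over the joint support."""
    support = set(p) | set(q)
    return 0.5 * sum(abs(p.get(x, 0.0) - q.get(x, 0.0)) for x in support)

# Example: two Bernoulli-like distributions over {"a", "b"}.
p = {"a": 0.5, "b": 0.5}
q = {"a": 0.75, "b": 0.25}
print(tv_distance(p, q))  # 0.25
```

The paper's setting concerns distributions represented implicitly by directed graphical models, where the support is exponentially large and such direct summation is infeasible, which is why a reduction to inference is needed.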
Primary Area: learning theory
Submission Number: 5901