Orthogonal Polynomials Quadrature Algorithm: a functional analytic approach to inverse problems in deep learning
Keywords: Orthogonal polynomials, inverse problem, posterior estimation
TL;DR: A new functional analytic algorithm for the approximation of the posterior distribution and the evidence
Abstract: We present the new Orthogonal Polynomials--Quadrature Algorithm (OPQA), a parallelizable algorithm that solves two common inverse problems in deep learning using a functional analytic approach. First, it finds a smooth probability density function that estimates the posterior and can serve as a proxy for fast inference; second, it estimates the evidence, i.e., the likelihood of obtaining a particular set of observations. All computations can be parallelized and completed in one pass.
A core component of OPQA is a functional transform of the square root of the joint distribution into a functional space of our own construction. Under this transform, the evidence equals the squared $L^2$ norm of the transformed function, so the evidence can be estimated by the sum of squares of the transform coefficients.
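For illustration, the identity behind this claim can be sketched as follows, assuming only a generic orthonormal basis $\{\psi_k\}$ of $L^2$ and the notation $p(x, \mathcal{D})$ for the joint distribution (the specific space constructed in the paper is not reproduced here). Writing $g(x) = \sqrt{p(x, \mathcal{D})}$ for the square root of the joint distribution and $c_k = \langle g, \psi_k \rangle$ for its transform coefficients, Parseval's identity gives
$$ Z \;=\; \int p(x, \mathcal{D})\, dx \;=\; \|g\|_{L^2}^2 \;=\; \sum_k c_k^2, $$
so truncating the sum after finitely many terms yields a computable estimate of the evidence $Z$.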
To expedite the computation of the transform coefficients, OPQA introduces a new computational scheme based on Gauss--Hermite quadrature in higher dimensions. Not only does it avoid the high-variance problem associated with random sampling methods, but it also allows the computation to be parallelized, and it significantly reduces the complexity via a vector decomposition.
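As an illustration of the quadrature step, the following is a minimal one-dimensional sketch, assuming for concreteness a Hermite-function basis and the standard Gauss--Hermite rule from NumPy; it is not the authors' implementation, and the higher-dimensional scheme, parallelization, and vector decomposition are omitted.

```python
# Minimal 1-D sketch (not the authors' implementation): estimate the evidence
# as the sum of squared coefficients of g(x) = sqrt(joint density) in a
# Hermite-function basis, with the coefficients computed by Gauss-Hermite
# quadrature.
import math
import numpy as np
from numpy.polynomial.hermite import hermgauss, hermval

def sqrt_joint(x):
    # Toy stand-in for sqrt(p(x, D)): here p(x, D) = N(x; 0, 1), so the
    # exact evidence is 1. In practice this would be sqrt(likelihood * prior).
    return (2.0 * np.pi) ** -0.25 * np.exp(-x ** 2 / 4.0)

K = 20        # number of basis functions kept in the truncated expansion
n_nodes = 60  # number of Gauss-Hermite quadrature nodes
nodes, weights = hermgauss(n_nodes)  # rule for integrals against exp(-x^2)

coeffs = np.zeros(K)
for k in range(K):
    # Normalized (physicists') Hermite polynomial h_k(x) = H_k(x) / sqrt(2^k k! sqrt(pi)),
    # so that psi_k(x) = h_k(x) exp(-x^2/2) is an orthonormal basis of L^2(R).
    e_k = np.zeros(k + 1)
    e_k[k] = 1.0
    h_k = hermval(nodes, e_k) / math.sqrt(2.0 ** k * math.factorial(k) * math.sqrt(math.pi))
    # c_k = \int g(x) h_k(x) e^{-x^2/2} dx = \int e^{-x^2} [g(x) h_k(x) e^{x^2/2}] dx
    coeffs[k] = np.sum(weights * sqrt_joint(nodes) * h_k * np.exp(nodes ** 2 / 2.0))

evidence_estimate = np.sum(coeffs ** 2)  # Parseval: Z is the sum of squared coefficients
print(evidence_estimate)                 # close to 1.0 for this toy joint
```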
Submission Number: 20