Entropic Matching for Expectation Propagation of Markov Jump Processes
TL;DR: We approximate the filtering and smoothing of Markov jump processes with discrete-time observations using the entropic matching method within an expectation propagation algorithm.
Abstract: We propose a novel, tractable latent state inference scheme for Markov jump processes, for which exact inference is often intractable. Our approach is based on an entropic matching framework that can be embedded into the well-known expectation propagation algorithm.
We demonstrate the effectiveness of our method by providing closed-form results for a simple family of approximate distributions and apply it to the general class of chemical reaction networks, which are a crucial tool for modeling in systems biology.
Moreover, we derive closed-form expressions for point estimation of the underlying parameters using an approximate expectation maximization procedure.
We evaluate our method across various chemical reaction networks and compare it to multiple baseline approaches, demonstrating superior performance in approximating the mean of the posterior process. Finally, we discuss the limitations of our method and potential avenues for future improvement, arguing that it offers a promising direction for complex continuous-time Bayesian inference problems.
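To make the idea concrete, the following is a minimal, hypothetical sketch (not the paper's actual algorithm) of entropic matching for the simplest chemical reaction network, a birth-death process, using a Poisson ansatz for the filtering distribution. Between observations the mean is propagated with its (here closed) moment equation; at an observation, the exact truncated posterior is KL-projected back onto the Poisson family, which for an exponential family amounts to matching the mean. All rate constants, the noise level, and the observation values are illustrative assumptions.

```python
import numpy as np
from scipy.special import gammaln

def predict(m, c1, c2, dt):
    """Propagate the Poisson-ansatz mean between observations.

    For the birth-death network (0 -> X at rate c1, X -> 0 at rate c2*x),
    the mean equation dm/dt = c1 - c2*m is closed and solves exactly.
    """
    return c1 / c2 + (m - c1 / c2) * np.exp(-c2 * dt)

def update(m, y, sigma, support=200):
    """Entropic-matching update for a noisy count observation y = x + noise.

    Form the exact (truncated) posterior of a Poisson(m) prior under a
    Gaussian likelihood, then project it back onto the Poisson family by
    minimizing KL divergence, which reduces to matching the mean.
    """
    x = np.arange(support)
    log_post = x * np.log(m) - m - gammaln(x + 1) - 0.5 * ((y - x) / sigma) ** 2
    w = np.exp(log_post - log_post.max())
    w /= w.sum()
    return float(np.dot(w, x))

# Filter a short synthetic observation sequence (values are hypothetical).
c1, c2, sigma, dt = 5.0, 0.5, 1.0, 1.0  # birth rate, death rate, obs noise, step
m = 1.0  # initial Poisson mean
for y in [4.0, 8.0, 11.0, 9.0]:
    m = predict(m, c1, c2, dt)  # moment propagation
    m = update(m, y, sigma)     # entropic (moment) matching at the observation
print(round(m, 2))
```

In a full expectation propagation scheme, such local projections would be iterated forward and backward over the observation sequence to obtain a smoothing approximation; this sketch shows only the forward filtering pass.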
Submission Number: 296