Is Parameter Learning via Weighted Model Integration Tractable?

Published: 25 Jul 2021, Last Modified: 05 May 2023 (TPM 2021)
Keywords: probabilistic models, parameter learning, constraints, hybrid distributions
TL;DR: We investigate the tractability of parameter learning via weighted model integration for hybrid piecewise distributions with algebraic constraints
Abstract: Weighted Model Integration (WMI) is a recent and general formalism for reasoning over hybrid continuous/discrete probabilistic models with logical and algebraic constraints. While many works have focused on inference in WMI models, the challenges of learning them from data have received much less attention. Our contribution is twofold. First, we provide novel theoretical insights on the problem of estimating the parameters of these models from data in a tractable way, generalizing previous results on maximum-likelihood estimation (MLE) to the broader family of log-linear WMI models. Second, we show how our results on WMI can characterize the tractability of inference and MLE for another widely used class of probabilistic models, Hinge Loss Markov Random Fields (HL-MRFs). Specifically, we bridge these two areas of research by reducing marginal inference in HL-MRFs to WMI inference, and thus we open up interesting new applications for both model classes.
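For orientation, a minimal sketch of the quantity the abstract refers to, following the standard WMI formulation in the literature (the notation below is illustrative and not taken from this page): given an SMT formula $\varphi$ over continuous variables $\mathbf{x}$ and Boolean variables $\mathbf{A}$, and a weight function $w$,

% Standard weighted model integral (as commonly defined in the WMI literature);
% notation is illustrative, not quoted from this paper.
\[
\mathrm{WMI}(\varphi, w \mid \mathbf{x}, \mathbf{A})
  \;=\; \sum_{\mu \in \mathbb{B}^{|\mathbf{A}|}}
        \int_{\{\mathbf{x} \,:\, (\mathbf{x}, \mu) \models \varphi\}}
        w(\mathbf{x}, \mu)\, d\mathbf{x},
\]

i.e., a sum over Boolean assignments of integrals of the weight over the continuous region satisfying the constraints; probabilities of queries are then obtained as ratios of such integrals.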