Neurosymbolic Markov Models

Published: 17 Jun 2024, Last Modified: 15 Jul 2024
Venue: 2nd SPIGM @ ICML (Oral)
License: CC BY 4.0
Abstract: Many fields of AI require models that can handle both probabilistic sequential dependencies and logical rules. For example, autonomous vehicles must obey traffic rules in uncertain environments. Deep Markov models excel in managing sequential probabilistic dependencies but fall short in incorporating logical constraints. Conversely, neurosymbolic AI (NeSy) integrates deep learning with logical rules into end-to-end differentiable models, yet struggles to scale in sequential settings. To address these limitations, we introduce neurosymbolic Markov models (NeSy-MM), which merge deep probabilistic Markov models with logic. We propose a scalable strategy for inference and learning in NeSy-MM combining Bayesian statistics, automated reasoning and gradient estimation. Our experimental results demonstrate that this framework not only scales up neurosymbolic inference, but also that incorporating logical knowledge into Markov models improves their performance.
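To make the core idea concrete, here is a minimal sketch (not the paper's actual algorithm) of the quantity a neurosymbolic Markov model must reason about: the probability, under a parameterized Markov chain, that a trajectory satisfies a logical rule. The state count `K`, horizon `T`, and the example rule ("state 2 may never directly follow state 0") are all hypothetical choices for illustration. This version computes the probability exactly by brute-force enumeration, which is precisely what scales poorly and what the paper's combination of automated reasoning and gradient estimation is designed to avoid.

```python
import itertools
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Hypothetical small model: K discrete states, trajectories of length T.
K, T = 3, 4
rng = np.random.default_rng(0)
pi = softmax(rng.normal(size=K))            # initial-state distribution
P = softmax(rng.normal(size=(K, K)), axis=1)  # transition matrix (rows sum to 1)

def constraint(traj):
    # Example logical rule (hypothetical): state 2 may never directly follow state 0.
    return all(not (a == 0 and b == 2) for a, b in zip(traj, traj[1:]))

def traj_prob(traj):
    # Probability of a full trajectory under the Markov chain.
    p = pi[traj[0]]
    for a, b in zip(traj, traj[1:]):
        p *= P[a, b]
    return p

# Exact probability that the rule holds, by enumerating all K**T trajectories.
all_trajs = list(itertools.product(range(K), repeat=T))
p_sat = sum(traj_prob(t) for t in all_trajs if constraint(t))
total = sum(traj_prob(t) for t in all_trajs)

print(f"P(constraint satisfied) = {p_sat:.4f}")  # strictly between 0 and 1 here
```

The enumeration cost grows as K^T, so a scalable approach must replace this sum with structured reasoning over the constraint and stochastic gradient estimates of the parameters, which is the role the abstract assigns to automated reasoning and gradient estimation.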
Submission Number: 130