Reparameterizing Hybrid Markov Logic Networks to handle Covariate-Shift in Representations

Published: 07 May 2025 · Last Modified: 13 Jun 2025 · UAI 2025 Poster · License: CC BY 4.0
Keywords: hybrid markov logic networks, statistical relational models, inference, reparameterization
TL;DR: We introduce a mixture model to reduce variance in HMLN parameterization and a reparameterization technique to tackle covariate shifts in DNN embeddings.
Abstract: We utilize Hybrid Markov Logic Networks (HMLNs) to combine embeddings learned by a Deep Neural Network (DNN) with symbolic relational knowledge. Since a DNN may not always learn optimal embeddings, we develop a mixture model to reduce variance in the HMLN parameterization. We also make inference in our model robust to covariate shifts that may occur in the DNN embeddings by reparameterizing the HMLN. In evaluations on Graph Neural Networks, our approach outperforms state-of-the-art methods that combine relational knowledge with DNN embeddings when covariate shifts are introduced in the embeddings. Finally, we demonstrate the utility of our approach for inferring latent student knowledge in a cognitive model called Deep Knowledge Tracing.
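To make the covariate-shift setting concrete, the sketch below simulates a scale-and-offset shift in scalar "embedding" features and standardizes them against their own empirical statistics. This is an illustrative toy, not the paper's HMLN reparameterization; the feature dimensions, shift magnitudes, and `reparameterize` helper are all assumptions introduced here.

```python
import random
import statistics

random.seed(0)

# Illustrative only: treat each "embedding" as a scalar feature that a
# downstream hybrid formula would consume (not the paper's actual setup).
train_emb = [random.gauss(0.0, 1.0) for _ in range(2000)]

# Covariate shift: the deployed DNN emits rescaled, offset embeddings.
shifted_emb = [1.5 * random.gauss(0.0, 1.0) + 2.0 for _ in range(2000)]

def reparameterize(values):
    """Standardize features by their own empirical mean and std, so a
    downstream model sees inputs on the scale it was parameterized for."""
    mu = statistics.fmean(values)
    sigma = statistics.stdev(values)
    return [(v - mu) / sigma for v in values]

corrected = reparameterize(shifted_emb)
```

After standardization, the shifted features again have roughly zero mean and unit variance, so continuous terms calibrated on the training distribution remain comparable; the paper's method instead reparameterizes the HMLN itself rather than the raw features.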
Latex Source Code: zip
Code Link: https://github.com/anupshakya07/uquant
Signed PMLR Licence Agreement: pdf
Submission Number: 720