Gradient Estimation For Exactly-$k$ Constraints

Published: 28 Oct 2023 · NeurIPS 2023 AI4Science Workshop Poster
Keywords: neuro-symbolic AI, probabilistic inference, gradient estimator, constraints
TL;DR: We propose a gradient estimator for the exactly-$k$ constraint under different distribution families.
Abstract: The exactly-$k$ constraint is ubiquitous in machine learning and scientific applications, such as ensuring that the sum of electric charges in a neutral atom is zero. However, enforcing such constraints in machine learning models while keeping learning differentiable is challenging. In this work, we aim to provide a "cookbook" for seamlessly incorporating exactly-$k$ constraints into machine learning models by extending a recent gradient estimator from Bernoulli variables to Gaussian and Poisson variables, utilizing constraint probabilities. We show the effectiveness of our proposed gradient estimators in synthetic experiments, and further demonstrate the practical utility of our approach by training neural networks to predict partial charges for metal-organic frameworks, aiding virtual screening in chemistry. Our proposed method not only enhances the capability of learning models but also expands their applicability to a wider range of scientific domains where constraint satisfaction is crucial.
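As an illustrative sketch (not the authors' released code), the central quantity in such estimators is the constraint probability $p(\sum_i X_i = k)$, which is itself differentiable in the distribution parameters. For independent Bernoulli variables this is the Poisson-binomial pmf, computable by a simple dynamic program; the PyTorch snippet below (function name and shapes are our own choices) shows gradients of the constraint probability flowing back to the logits via autograd. For independent Poisson variables the analogous probability is available in closed form, since the sum is again Poisson with rate $\sum_i \lambda_i$.

```python
# A minimal sketch, not the authors' implementation: differentiable computation
# of the exactly-k constraint probability for n independent Bernoulli variables.
# P(sum_i X_i = k) is the Poisson-binomial pmf, computed by dynamic programming
# so that gradients w.r.t. the probabilities flow through autograd.
import torch

def exactly_k_prob(p: torch.Tensor, k: int) -> torch.Tensor:
    """p: shape (n,) Bernoulli success probabilities; returns P(sum_i X_i = k)."""
    dp = torch.zeros(k + 1, dtype=p.dtype)  # dp[j] = P(first i variables sum to j)
    dp[0] = 1.0
    for pi in p:
        shifted = torch.cat([dp.new_zeros(1), dp[:-1]])  # X_i = 1 raises the sum by one
        dp = dp * (1 - pi) + shifted * pi
    return dp[k]

# Usage: the constraint probability is a differentiable function of the logits,
# so its gradient is obtained directly by backpropagation.
logits = torch.randn(10, requires_grad=True)
prob_k = exactly_k_prob(torch.sigmoid(logits), k=3)
prob_k.backward()
print(prob_k.item(), logits.grad)
```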
Submission Track: Original Research
Submission Number: 191