Tensorised Probabilistic Inference for Neural Probabilistic Logic Programming

Published: 26 Jul 2022, Last Modified: 17 May 2023, TPM 2022
Keywords: neural-symbolic AI, logic, probability, neural networks, probabilistic logic programming, learning and reasoning, tensor inference, parallelism
TL;DR: Use of tensors during probabilistic logic inference to exploit parallelism and achieve a measurable speed-up in learning and inference time.
Abstract: Neural Probabilistic Logic Programming (NPLP) languages have shown how to combine the neural paradigm with that of probabilistic logic programming. Together, they form a neural-symbolic framework that integrates low-level perception with high-level reasoning. Such an integration has been shown to help in the limited-data regime and to facilitate better generalisation to out-of-distribution data. However, probabilistic logic inference does not lend itself to data parallelisation because of the asymmetries that arise in the proof trees during grounding. By lifting part of this inference procedure to symbolic tensor operations, which facilitate parallelisation, we achieve a measurable speed-up in learning and inference time. We implemented this tensor perspective in the NPLP language DeepProbLog and demonstrated the speed-up by comparing against its regular implementation, which uses state-of-the-art probabilistic inference techniques.
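The core idea can be illustrated with a toy example. Below is a minimal sketch (not the paper's actual implementation, and the formula, function names, and NumPy usage are illustrative assumptions): the arithmetic circuit for the weighted model count of the formula (a AND b) OR (NOT a AND c), whose two disjuncts are mutually exclusive, is evaluated once per example in a Python loop versus once for a whole batch as a single tensor expression. The tensorised version computes the same probabilities but exposes data parallelism to the underlying array library.

```python
import numpy as np

# Illustrative circuit (an assumption, not taken from the paper):
# for (a AND b) OR (NOT a AND c) the disjuncts are mutually exclusive,
# so the weighted model count is  P = p_a * p_b + (1 - p_a) * p_c.

def wmc_scalar(p):
    """Evaluate the circuit for one example p = (p_a, p_b, p_c)."""
    pa, pb, pc = p
    return pa * pb + (1.0 - pa) * pc

def wmc_batched(P):
    """Evaluate the same circuit for a batch at once.

    P has shape (batch, 3); one vectorised tensor expression
    replaces the per-example Python loop.
    """
    return P[:, 0] * P[:, 1] + (1.0 - P[:, 0]) * P[:, 2]

rng = np.random.default_rng(0)
batch = rng.random((4, 3))  # e.g. four sets of neural-net output probabilities

loop_result = np.array([wmc_scalar(p) for p in batch])   # sequential
tensor_result = wmc_batched(batch)                       # parallelisable
assert np.allclose(loop_result, tensor_result)
```

In a real NPLP system the circuits produced by grounding differ per query, which is precisely the asymmetry the abstract mentions; the tensorised approach lifts the shared structure so that many examples can be pushed through the same tensor operations.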