Computing Tractable Probabilistic Models: A Hardware Perspective

Published: 17 Jun 2025, Last Modified: 20 Jun 2025 · TPM 2025 · CC BY 4.0
Keywords: Tractable probabilistic models, Probabilistic circuits, Hardware acceleration
Abstract: Deep learning models have recently raised concerns about overconfidence and reliability, and alternative models for trustworthy and explicit decision-making systems are on the rise. Among them, tractable probabilistic models (TPMs) have gained significant interest, as their tractability can be exploited for energy-efficient and general-purpose inference. Yet, although "software" implementations of TPMs have shown great potential, the hardware computation and acceleration of these models remain largely underexplored. In this work, we offer a perspective on why this is the case, and elaborate on what can be done to design more efficient processors suited to TPMs. Our analysis shows that, although current research appears fragmented, several pieces of the puzzle can be combined to enable broader use and more efficient computation of TPMs in edge AI systems.
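As a rough illustration of the tractability claim above (not taken from the paper), the sketch below evaluates a toy probabilistic circuit over two binary variables in a single bottom-up pass. The `Leaf`, `Product`, and `Sum` classes and the weights are hypothetical and purely illustrative; the point is that exact likelihood computation reduces to a fixed sequence of multiply-accumulate operations, linear in circuit size, which is what makes hardware acceleration of TPMs attractive.

```python
# Minimal sketch, assuming a tiny probabilistic circuit over two binary
# variables X0 and X1. All node classes and parameters here are illustrative,
# not from the paper: they show how PC inference maps to a feed-forward pass
# of multiply-accumulate operations.

from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Leaf:
    var: int        # index of the binary variable this leaf models
    p_true: float   # Bernoulli parameter P(X_var = 1)

    def value(self, x: Tuple[int, ...]) -> float:
        return self.p_true if x[self.var] == 1 else 1.0 - self.p_true


@dataclass
class Product:
    children: List["object"]

    def value(self, x: Tuple[int, ...]) -> float:
        out = 1.0
        for child in self.children:
            out *= child.value(x)
        return out


@dataclass
class Sum:
    children: List["object"]
    weights: List[float]  # convex combination: non-negative, sums to 1

    def value(self, x: Tuple[int, ...]) -> float:
        return sum(w * c.value(x) for w, c in zip(self.weights, self.children))


# Root = 0.3 * [P(X0) * P(X1)] + 0.7 * [P'(X0) * P'(X1)]: a 2-component mixture.
root = Sum(
    children=[
        Product([Leaf(0, 0.9), Leaf(1, 0.2)]),
        Product([Leaf(0, 0.1), Leaf(1, 0.8)]),
    ],
    weights=[0.3, 0.7],
)

if __name__ == "__main__":
    # Exact likelihood of one assignment: one pass over the circuit,
    # with cost linear in the number of edges (here, a handful of MACs).
    print(root.value((1, 0)))  # P(X0=1, X1=0) = 0.3*0.72 + 0.7*0.02 = 0.23
```

This regular dataflow (no sampling loops, no data-dependent control flow) is one reason TPM inference is a natural target for specialized, energy-efficient processors.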
Submission Number: 2