Proximal Interacting Particle Langevin Algorithms

Published: 07 May 2025, Last Modified: 13 Jun 2025
Venue: UAI 2025 Oral
License: CC BY 4.0
Keywords: non-differentiable latent variable models, proximal algorithms, interacting particle algorithms, Langevin dynamics, image deblurring
TL;DR: We introduce algorithms for estimating parameters in non-differentiable statistical models.
Abstract: We introduce a class of algorithms, termed proximal interacting particle Langevin algorithms (PIPLA), for inference and learning in latent variable models whose joint probability density is non-differentiable. Leveraging proximal Markov chain Monte Carlo techniques and interacting particle Langevin algorithms, we propose three algorithms tailored to the problem of estimating parameters in a non-differentiable statistical model. We prove non-asymptotic bounds for the parameter estimates produced by the different algorithms in the strongly log-concave setting and provide comprehensive numerical experiments on various models to demonstrate the effectiveness of the proposed methods. In particular, we demonstrate the utility of our family of algorithms for sparse Bayesian logistic regression, training of sparse Bayesian neural networks or neural networks with non-differentiable activation functions, image deblurring, and sparse matrix completion. Our theory and experiments together show that the PIPLA family can be the de facto choice for parameter estimation problems in non-differentiable latent variable models.
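To give a concrete flavour of the approach described in the abstract, the following is a minimal, illustrative sketch (not the paper's exact method, and not necessarily identical to any of its three PIPLA variants) of an interacting particle Langevin update in which a non-differentiable term is handled through its Moreau-Yosida envelope via a proximal operator. The toy model, step sizes, and all names (`grad_theta`, `grad_moreau`, `theta_true`, etc.) are assumptions chosen for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Toy non-differentiable latent variable model (illustrative assumption) ---
# y_j | x_j, theta ~ N(theta + x_j, sigma_y^2),  x_j ~ Laplace(0, b); the L1 prior term is non-differentiable
sigma_y, b = 1.0, 1.0
theta_true = 2.0
n = 50
x_true = rng.laplace(0.0, b, size=n)
y = theta_true + x_true + sigma_y * rng.normal(size=n)

def grad_theta(theta, X):
    # Gradient of the smooth part of log p_theta(x, y) w.r.t. theta, averaged over the particle cloud.
    # X has shape (N, n): N particles, each a full copy of the latent vector.
    return np.mean(np.sum(y[None, :] - theta - X, axis=1)) / sigma_y**2

def grad_x_smooth(theta, X):
    # Gradient of the smooth (Gaussian likelihood) part w.r.t. the latent variables.
    return (y[None, :] - theta - X) / sigma_y**2

def prox_l1(v, tau):
    # Proximal map of tau * |.| (soft-thresholding); handles the non-differentiable Laplace prior term.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def grad_moreau(X, lam):
    # Gradient of the Moreau-Yosida envelope of g(x) = |x| / b:
    # grad M_lam(x) = (x - prox_{lam * g}(x)) / lam, a smooth surrogate for the subgradient of g.
    return (X - prox_l1(X, lam / b)) / lam

# --- Proximal interacting particle Langevin iteration (sketch) ---
N, gamma, lam, n_iter = 100, 0.01, 0.1, 5000
theta = 0.0
X = rng.normal(size=(N, n))

for _ in range(n_iter):
    noise_theta = rng.normal()
    noise_X = rng.normal(size=X.shape)
    # Parameter update: averaged gradient over particles, noise scaled by 1/sqrt(N) as in IPLA-type schemes.
    theta_new = theta + gamma * grad_theta(theta, X) + np.sqrt(2 * gamma / N) * noise_theta
    # Particle update: smooth gradient plus Moreau-Yosida gradient of the non-differentiable term.
    X = X + gamma * (grad_x_smooth(theta, X) - grad_moreau(X, lam)) + np.sqrt(2 * gamma) * noise_X
    theta = theta_new

print(f"estimated theta: {theta:.3f}  (true value: {theta_true})")
```

In this sketch the parameter estimate is driven by the particle-averaged gradient while the particles themselves approximately target the posterior over the latents; the proximal/Moreau-Yosida step is what makes the scheme applicable when the joint density is non-differentiable.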
LaTeX Source Code: zip
Code Link: https://github.com/paulaoak/proximal-ipla
Signed PMLR Licence Agreement: pdf
Submission Number: 741