Deep Variational Inference Symbolic Regression

17 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: variational inference, symbolic regression, bayesian statistics, neural networks
TL;DR: A deep variational inference approach to symbolic regression
Abstract: A Bayesian inference approach to symbolic regression combines two powerful interpretability properties. On its own, symbolic regression offers explainable, unconstrained, closed-form expressions. Combined with Bayesian inference, symbolic regression additionally provides probability distributions over these interpretable models, accounting for real-world, limited, noisy data. Deep Symbolic Regression (DSR) is an algorithm that uses neural networks to perform symbolic regression; however, it aims to locate a single expression that best fits the data rather than to compute posteriors. In this work, we introduce Deep Variational Inference Symbolic Regression (DVISR). DVISR extends DSR into a fully Bayesian approach to symbolic regression by replacing the reward function used to train the network with the integrand of the expectation in the evidence lower bound (ELBO). DVISR also modifies the network architecture to output probability distributions over constants within the expressions. This architectural modification enables the computation of posterior distributions over both the expression trees and the constants they contain. We show that DVISR can recover the true posterior distribution in simple settings and demonstrate how its performance scales as expression size grows.
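The abstract does not specify the exact form of the per-sample training signal, but the quantity it describes (the inner part of the ELBO expectation) can be sketched for a single sampled constant in a fixed expression skeleton. The sketch below is a hypothetical toy illustration, not the paper's implementation: the expression `c * sin(x)`, the Gaussian variational posterior `q(c)`, and all hyperparameters are assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data from y = 2.5 * sin(x) + noise (assumed example, not from the paper)
x = np.linspace(-3, 3, 50)
y = 2.5 * np.sin(x) + rng.normal(0, 0.1, size=x.shape)

def log_gauss(v, mean, std):
    """Log-density of a univariate Gaussian N(mean, std^2) at v."""
    return -0.5 * np.log(2 * np.pi * std**2) - (v - mean) ** 2 / (2 * std**2)

def elbo_inner(c, q_mean, q_std, noise_std=0.1, prior_std=10.0):
    """Inner term of the ELBO expectation for one sampled constant c:
    log p(D | c) + log p(c) - log q(c).
    Averaging this over samples c ~ q gives a Monte Carlo ELBO estimate."""
    log_lik = np.sum(log_gauss(y, c * np.sin(x), noise_std))  # Gaussian likelihood
    log_prior = log_gauss(c, 0.0, prior_std)                  # broad prior on c
    log_q = log_gauss(c, q_mean, q_std)                       # variational posterior
    return log_lik + log_prior - log_q

# Monte Carlo estimate of the ELBO under q(c) = N(q_mean, q_std^2)
q_mean, q_std = 2.0, 0.5
samples = rng.normal(q_mean, q_std, size=200)
elbo = np.mean([elbo_inner(c, q_mean, q_std) for c in samples])
```

In DVISR's setting this per-sample term would serve as the reward signal in place of DSR's fit-based reward, with the network supplying the distributions from which expressions and constants are sampled.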
Supplementary Material: zip
Primary Area: probabilistic methods (Bayesian methods, variational inference, sampling, UQ, etc.)
Submission Number: 9537