Deep Differentiable Symbolic Regression Neural Network

Published: 01 Jan 2025 · Last Modified: 06 Mar 2025 · Neurocomputing 2025 · License: CC BY-SA 4.0
Abstract: Deep learning (DL) methods have recently been applied to the symbolic regression (SR) problem. However, SR's discrete search space makes these methods non-differentiable, resulting in long training times and difficult convergence. To address this, we introduce a new approach: the Deep Differentiable Symbolic Regression Neural Network (DDSR-NN). Within DDSR-NN, every layer represents a continuous distribution over mathematical operators, and the connections between layers represent the coefficients linking these operators; the network as a whole therefore represents a distribution over mathematical expressions. To ensure efficiency and avoid overly complex results, DDSR-NN uses gate masks to prune results and rule masks to shrink the search space. Furthermore, we developed a hybrid training algorithm that merges gradient descent with a risk-seeking policy gradient method, ensuring that DDSR-NN converges quickly to an expression distribution with minimal loss. Experiments on 35 distinct datasets show that DDSR-NN outperforms 23 baseline algorithms in balancing accuracy and interpretability. Notably, compared to the three neural network (NN) baselines, DDSR-NN consistently yielded more accurate results.
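To make the core idea concrete, below is a minimal, hypothetical sketch (not the authors' implementation) of a differentiable SR layer: each output unit holds learnable softmax logits over a small operator set, so the otherwise discrete choice of operator becomes a continuous mixture that plain gradient descent can optimize. The operator set, layer sizes, and training loop are illustrative assumptions; the paper's gate masks, rule masks, and risk-seeking policy-gradient phase are omitted here.

```python
# Hypothetical sketch, assuming PyTorch; not the authors' code.
import torch
import torch.nn as nn

# A small, assumed operator set; the paper's actual set may differ.
OPERATORS = [
    ("id",  lambda x: x),
    ("sin", torch.sin),
    ("cos", torch.cos),
    ("sq",  lambda x: x * x),
]

class SoftOperatorLayer(nn.Module):
    """One layer holding a continuous distribution over operators:
    each output unit mixes all operator outputs with softmax weights."""
    def __init__(self, in_features, out_features):
        super().__init__()
        # Learnable logits over operators, one distribution per output unit.
        self.op_logits = nn.Parameter(torch.zeros(out_features, len(OPERATORS)))
        # Linear "coefficients" connecting this layer to the previous one.
        self.linear = nn.Linear(in_features, out_features)

    def forward(self, x):
        z = self.linear(x)                                    # (batch, out)
        ops = torch.stack([f(z) for _, f in OPERATORS], -1)   # (batch, out, n_ops)
        w = torch.softmax(self.op_logits, dim=-1)             # (out, n_ops)
        return (ops * w).sum(dim=-1)                          # (batch, out)

# Toy usage: fit y = sin(x) + 0.5 x with the gradient-descent phase only.
model = nn.Sequential(SoftOperatorLayer(1, 8), SoftOperatorLayer(8, 1))
x = torch.linspace(-3, 3, 256).unsqueeze(1)
y = torch.sin(x) + 0.5 * x
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(500):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    opt.step()
# Taking the argmax over each unit's op_logits recovers a discrete expression.
```

The design point is that mixing operator outputs with softmax weights keeps the forward pass fully differentiable, while training tends to concentrate each unit's distribution on one operator, so a discrete expression can be read off at the end.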