Abstract: Learning a function from input–output data pairs is one of the most fundamental tasks in machine learning. In this work, we propose a generalization of the Canonical Polyadic Decomposition (CPD) from tensors to multivariate functions of continuous variables, and show how it can be applied to supervised learning. We approximate a compactly supported multivariate function using a tensor of truncated multidimensional Fourier series coefficients and propose a hidden tensor factorization formulation for learning a low-rank CPD model of the Fourier coefficient tensor. In contrast to prior work, our method is general: it can model any compactly supported multivariate function that is well-approximated by a finite multidimensional Fourier series, and under certain conditions it guarantees that the unknown function is uniquely characterized by the given input-output data. Furthermore, our model naturally admits stochastic gradient updates, allowing it to scale to large datasets. We develop two optimization algorithms and demonstrate promising results on synthetic and real multivariate regression tasks.
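The sketch below is a minimal, illustrative reading of the idea in the abstract, not the authors' algorithm: the truncated Fourier coefficient tensor is constrained to a rank-R CPD, so the model reduces to a sum of R separable products of per-variable Fourier feature projections, which can be fit by mini-batch (stochastic) gradient descent on squared error. The class name `FourierCPD`, the use of real sine/cosine features on [0, 1]^D, and the choice of PyTorch with Adam are all assumptions made for illustration.

```python
# Minimal sketch (assumptions, not the paper's exact formulation):
# f(x_1, ..., x_D) ~ sum_r prod_d <phi(x_d), A_d[:, r]>, where phi(x_d) is a
# truncated real Fourier feature vector and A_d are the CPD factor matrices.
import torch

class FourierCPD(torch.nn.Module):
    def __init__(self, dim, num_freqs, rank):
        super().__init__()
        self.dim, self.K, self.R = dim, num_freqs, rank
        # One factor matrix per input variable: (2*K + 1) Fourier features x rank R.
        self.factors = torch.nn.ParameterList(
            [torch.nn.Parameter(0.1 * torch.randn(2 * num_freqs + 1, rank))
             for _ in range(dim)]
        )

    def features(self, x_d):
        # Real truncated Fourier features [1, cos(2*pi*k*x), sin(2*pi*k*x)], k = 1..K.
        k = torch.arange(1, self.K + 1, dtype=x_d.dtype)
        ang = 2 * torch.pi * x_d[:, None] * k[None, :]
        return torch.cat(
            [torch.ones_like(x_d)[:, None], torch.cos(ang), torch.sin(ang)], dim=1
        )

    def forward(self, x):
        # CPD structure: product over variables of the per-variable projections,
        # then sum over the R rank-one components.
        prod = torch.ones(x.shape[0], self.R)
        for d in range(self.dim):
            prod = prod * (self.features(x[:, d]) @ self.factors[d])
        return prod.sum(dim=1)

# Toy usage: fit a 3-variate periodic function from (x, y) samples with SGD-style updates.
torch.manual_seed(0)
X = torch.rand(2048, 3)
y = torch.sin(2 * torch.pi * X[:, 0]) * torch.cos(2 * torch.pi * X[:, 1]) + torch.sin(2 * torch.pi * X[:, 2])
model = FourierCPD(dim=3, num_freqs=4, rank=5)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for step in range(2000):
    idx = torch.randint(0, X.shape[0], (128,))          # mini-batch -> stochastic updates
    loss = torch.mean((model(X[idx]) - y[idx]) ** 2)
    opt.zero_grad(); loss.backward(); opt.step()
```

Because each rank-one term factors across variables, evaluation and gradients cost O(D * K * R) per sample rather than scaling with the full (2K+1)^D coefficient tensor, which is the property that makes mini-batch updates practical.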