Variational Elliptical Processes

Published: 05 Sept 2023, Last Modified: 05 Sept 2023
Accepted by TMLR
Abstract: We present elliptical processes—a family of non-parametric probabilistic models that subsumes Gaussian processes and Student's t processes. This generalization includes a range of new heavy-tailed behaviors while retaining computational tractability. Elliptical processes are based on a representation of elliptical distributions as a continuous mixture of Gaussian distributions. We parameterize this mixture distribution as a spline normalizing flow, which we train using variational inference. The proposed form of the variational posterior enables a sparse variational elliptical process applicable to large-scale problems. We highlight advantages compared to Gaussian processes through regression and classification experiments. Elliptical processes can supersede Gaussian processes in several settings, including cases where the likelihood is non-Gaussian or when accurate tail modeling is essential.
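The abstract's key construction is the representation of an elliptical distribution as a continuous scale mixture of Gaussians: a sample is a Gaussian draw rescaled by a positive mixing variable. The following sketch illustrates this view on a Student-t process as a special case; the kernel, function names, and the inverse-gamma mixing are illustrative assumptions for this example only, not the paper's method, which learns the mixing distribution with a spline normalizing flow trained by variational inference.

```python
import numpy as np

def rbf_kernel(x, lengthscale=1.0):
    # Squared-exponential kernel on 1-D inputs (illustrative choice).
    d = x[:, None] - x[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def sample_elliptical_process(x, sample_mixing, n_samples=1000, jitter=1e-8, rng=None):
    # Gaussian scale-mixture view: f = s * z with z ~ N(0, K) and s > 0
    # drawn from the mixing distribution.
    rng = np.random.default_rng(rng)
    K = rbf_kernel(x) + jitter * np.eye(len(x))
    L = np.linalg.cholesky(K)
    z = rng.standard_normal((n_samples, len(x)))
    s = sample_mixing(n_samples, rng)           # positive mixing scales
    return s[:, None] * (z @ L.T)

# Student-t process recovered by s^2 ~ Inv-Gamma(nu/2, nu/2),
# i.e. s = sqrt(nu / chi2(nu)); nu = 5 gives marginal variance nu/(nu-2).
nu = 5.0
t_mixing = lambda n, rng: np.sqrt(nu / rng.chisquare(nu, size=n))

x = np.linspace(0.0, 1.0, 5)
samples = sample_elliptical_process(x, t_mixing, n_samples=20000, rng=0)
```

Swapping `sample_mixing` for a constant scale of 1 recovers a plain Gaussian process, which is how this mixture family subsumes GPs while admitting heavier tails.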
Submission Length: Long submission (more than 12 pages of main content)
Previous TMLR Submission Url:
Changes Since Last Submission: The main change from the last submission is that the regression experiments have been extended according to the suggestions made by Reviewer 9XiL in the comment from July 30. More specifically, we now break down the performance analysis into multiple steps to clarify the impact of different modifications on the likelihood and posterior. Further, we have included the heteroscedastic models in this comparison on real-world datasets. We have also retrained the non-approximated GP baseline as suggested and included an evaluation of how the number of inducing points affects the quality of the variational approximation. Overall, the additional results show that the elliptical process achieves better log-likelihoods, especially on larger datasets. We have also made changes to improve the clarity and readability of the paper. In particular, we removed the multi-path approach, as discussed before, and also removed the parametric version similar to Jankowiak et al. (2020) in favor of keeping the presentation unified around variational inference.
Minor changes:
- Added a non-sparse competitor in the classification experiments.
- In Section 3.1, we now train the likelihood using numerical integration instead of optimizing the ELBO.
- Updated several figures.
Assigned Action Editor: ~Sinead_Williamson1
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Number: 860