Accelerating Non-Conjugate Gaussian Processes By Trading Off Computation For Uncertainty

TMLR Paper 3529 Authors

21 Oct 2024 (modified: 04 Nov 2024) · Under review for TMLR · CC BY 4.0
Abstract: Non-conjugate Gaussian processes (NCGPs) define a flexible probabilistic framework to model categorical, ordinal and continuous data, and are widely used in practice. However, exact inference in NCGPs is prohibitively expensive for large datasets, so approximations are required. The approximation error adversely impacts the reliability of the model and is not accounted for in the uncertainty of the prediction. We introduce a family of iterative methods that explicitly model this error. They are uniquely suited to modern parallel computing hardware, efficiently recycle computations, and compress information to reduce both the time and memory requirements of NCGPs. As we demonstrate on large-scale classification problems, our method significantly accelerates training compared to competitive baselines by trading off reduced computation for increased uncertainty.
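The abstract's central idea of trading reduced computation for increased predictive uncertainty can be illustrated with a small toy example. The sketch below is a hypothetical simplification written for illustration only, not the paper's algorithm: it uses plain NumPy, a crude Laplace-style approximation for binary GP classification with a constant Hessian bound `w`, and a truncated conjugate-gradient (CG) solve. Only the directions the solver has actually explored reduce the predictive variance, so fewer solver iterations leave more uncertainty in the prediction.

```python
# Toy illustration (hypothetical, not the paper's algorithm): a 1-D binary GP
# classifier with a crude Laplace-style approximation whose inner linear solve
# is truncated after a fixed number of conjugate-gradient (CG) steps.  Only the
# directions actually explored by the solver reduce the predictive variance,
# so spending less computation leaves more predictive uncertainty.
import numpy as np


def rbf_kernel(X1, X2, lengthscale=1.0):
    d2 = (X1[:, None] - X2[None, :]) ** 2
    return np.exp(-0.5 * d2 / lengthscale**2)


def truncated_cg(A, b, num_iters):
    """Run at most `num_iters` CG steps on A x = b.

    Returns the current iterate and the A-normalised conjugate directions S
    explored so far; S @ S.T is a low-rank approximation of A^{-1}.
    """
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    directions = []
    for _ in range(num_iters):
        if np.linalg.norm(r) < 1e-10:  # solver already converged
            break
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)
        directions.append(p / np.sqrt(p @ Ap))
        x = x + alpha * p
        r_new = r - alpha * Ap
        p = r_new + (r_new @ r_new) / (r @ r) * p
        r = r_new
    return x, np.stack(directions, axis=1)


rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3.0, 3.0, 40))
y = np.sign(np.sin(X) + 0.3 * rng.standard_normal(40))  # labels in {-1, +1}

w = 0.25                                    # constant bound on the logistic Hessian
K = rbf_kernel(X, X) + 1e-6 * np.eye(len(X))
B = np.eye(len(X)) + w * K                  # Laplace-style system matrix I + W K

Xs = np.linspace(-3.0, 3.0, 5)              # test inputs
Ks = rbf_kernel(X, Xs)                      # cross-covariances, shape (40, 5)
kss = np.ones(len(Xs))                      # prior variances at the test inputs

for iters in (2, 10, 40):
    v, S = truncated_cg(B, y.astype(float), iters)
    mean = Ks.T @ v                         # illustrative predictive mean (skips Newton steps)
    # Variance is reduced only along the explored directions S; the unexplored
    # remainder of the solve stays in the predictive uncertainty.
    reduction = w * np.sum((S.T @ Ks) ** 2, axis=0)
    std = np.sqrt(kss - reduction)
    print(f"{iters:2d} CG iterations: "
          f"mean |latent prediction| = {np.abs(mean).mean():.3f}, "
          f"mean predictive std = {std.mean():.3f}")
```

Running the loop with more CG iterations shrinks the reported predictive standard deviation, which mirrors, at toy scale, the computation-versus-uncertainty trade-off described in the abstract.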
Submission Length: Regular submission (no more than 12 pages of main content)
Assigned Action Editor: ~Sinead_Williamson1
Submission Number: 3529