Keywords: Functional mapping, Gaussian Process
TL;DR: deep GP function-on-function regression
Abstract: Learning mappings between functional spaces, also known as function-on-function regression, plays a crucial role in functional data analysis and has broad applications, such as spatiotemporal forecasting, curve prediction, and climate modeling. Existing approaches, such as functional linear models and neural operators, either fail to capture complex nonlinearities or lack the capacity to provide reliable uncertainty quantification under noisy, sparse, and irregularly sampled data. To address these issues, we propose Deep Gaussian Processes for Functional Maps (DGP-FM). Our method designs a sequence of GP-based linear and nonlinear transformations, leveraging integral transforms of kernels, GP interpolation, and nonlinear activations sampled from GPs. A key insight simplifies implementation: under fixed sampling locations, discrete approximations of kernel integral transforms collapse into direct functional integral transforms, enabling flexible incorporation of various integral transform designs. To achieve scalable probabilistic inference, we use inducing points and whitening transformations to develop a variational learning algorithm. Empirical results on real-world and PDE benchmark datasets demonstrate the advantage of DGP-FM in both predictive performance and uncertainty calibration.
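To make the core building block concrete, below is a minimal sketch of a discretized kernel integral transform as described in the abstract: with fixed sampling locations, the transform g(s) = ∫ k(s, t) f(t) dt reduces to a single matrix product over quadrature weights. This is an illustrative reconstruction, not the authors' implementation; the RBF kernel, the function names (`rbf_kernel`, `integral_transform_layer`), and the `tanh` placeholder (standing in for DGP-FM's GP-sampled nonlinear activations) are assumptions for demonstration only.

```python
import numpy as np

def rbf_kernel(s, t, lengthscale=0.2):
    # Squared-exponential kernel between output locations s and input locations t.
    d = s[:, None] - t[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def integral_transform_layer(f_vals, t_in, s_out, lengthscale=0.2):
    # Discretized kernel integral transform:
    #   g(s) = \int k(s, t) f(t) dt  ≈  sum_j w_j k(s, t_j) f(t_j)
    # With the sampling locations t_j fixed, this collapses to one matrix product.
    w = np.gradient(t_in)                      # simple quadrature weights for irregular grids
    K = rbf_kernel(s_out, t_in, lengthscale)   # (|s_out|, |t_in|) kernel matrix
    return K @ (w * f_vals)

# Toy usage: map a noisy curve sampled at 50 irregular input locations
# to an output curve evaluated on a regular grid of 100 points.
rng = np.random.default_rng(0)
t_in = np.sort(rng.uniform(0.0, 1.0, 50))
f_vals = np.sin(2 * np.pi * t_in) + 0.05 * rng.standard_normal(50)
s_out = np.linspace(0.0, 1.0, 100)

g = integral_transform_layer(f_vals, t_in, s_out)  # linear functional map
g = np.tanh(g)  # placeholder nonlinearity; DGP-FM instead samples activations from a GP
```

Stacking several such transform-plus-activation layers, with inducing points and whitening for variational inference, would mirror the deep architecture the abstract outlines.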
Primary Area: probabilistic methods (Bayesian methods, variational inference, sampling, UQ, etc.)
Submission Number: 22660