Spectral Convolutional Conditional Neural Processes

ICLR 2024 Workshop TS4H Submission 9 Authors

Published: 08 Mar 2024, Last Modified: 27 Mar 2024. TS4H Poster. License: CC BY 4.0
Keywords: Neural Process, Stochastic Process, Uncertainty Estimation, Irregularly Sampled Time Series
Abstract: Conditional Neural Processes (CNPs) constitute a family of probabilistic models that harness the flexibility of neural networks to parameterize stochastic processes. Their ability to furnish well-calibrated predictions, combined with simple maximum-likelihood training, has established them as appealing solutions for a variety of learning problems, with a particular emphasis on meta-learning. A prominent member of this family, Convolutional Conditional Neural Processes (ConvCNPs), uses convolution to explicitly introduce translation equivariance as an inductive bias. However, ConvCNPs' reliance on local discrete kernels in their convolution layers can make it difficult to capture long-range dependencies and complex patterns in the data, especially when dealing with limited and irregularly sampled observations from a new task. Building on the success of Fourier neural operators (FNOs) in approximating the solution operators of parametric partial differential equations (PDEs), we propose Spectral Convolutional Conditional Neural Processes (SConvCNPs), a new addition to the NP family that allows for more efficient representation of functions in the frequency domain.
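To make the core idea concrete: an FNO-style spectral convolution replaces a local discrete kernel with a pointwise multiplication of learnable weights on the lowest Fourier modes, giving a global receptive field in one layer. The sketch below is a generic, minimal NumPy illustration of that operation, not the paper's actual implementation; the function name, signature, and the choice of a real FFT on a 1D signal are assumptions for illustration.

```python
import numpy as np

def spectral_conv1d(x, weights, modes):
    """Minimal FNO-style spectral convolution on a 1D real signal (a sketch).

    x       : real array of shape (n,)
    weights : complex array of shape (modes,), the learnable spectral filter
    modes   : number of low-frequency modes to keep (modes <= n//2 + 1)
    """
    x_ft = np.fft.rfft(x)                    # transform to the frequency domain
    out_ft = np.zeros_like(x_ft)
    out_ft[:modes] = x_ft[:modes] * weights  # mix only the lowest `modes` frequencies
    return np.fft.irfft(out_ft, n=len(x))    # transform back to the spatial domain

# Toy usage: with identity weights, a band-limited signal passes through unchanged,
# since all of its energy sits below the mode cutoff.
n, modes = 64, 8
t = np.linspace(0.0, 1.0, n, endpoint=False)
x = np.sin(2 * np.pi * 3 * t)                # frequency 3 < modes, so it is retained
y = spectral_conv1d(x, np.ones(modes, dtype=complex), modes)
```

Because the multiplication acts on every retained frequency at once, each layer couples all input locations, which is the property the abstract contrasts with the local kernels of ConvCNPs.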
Submission Number: 9