The Gaussian Neural Process

Published: 21 Dec 2020, Last Modified: 12 Mar 2024, AABI 2020
Keywords: Meta-Learning, Neural Processes, Gaussian Processes
Abstract: Neural Processes (NPs; Garnelo et al., 2018a,b) are a rich class of models for meta-learning that map data sets directly to predictive stochastic processes. We provide a rigorous analysis of the standard maximum-likelihood objective used to train conditional NPs. Moreover, we propose a new member to the Neural Process family called the Gaussian Neural Process (GNP), which models predictive correlations, incorporates translation equivariance, provides universal approximation guarantees, and demonstrates encouraging performance.
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/arxiv:2101.03606/code)