Incorporating Prior Knowledge into Neural Networks through an Implicit Composite Kernel

16 May 2022 (modified: 03 Jul 2024) · NeurIPS 2022 Submission
Keywords: Neural network, Gaussian process, Composite kernel, Nyström method
TL;DR: This paper presents the Implicit Composite Kernel (ICK) framework, which combines a neural-network-implied kernel with a chosen kernel function via a kernel-to-latent-space transformation based on the Nyström approximation.
Abstract: It is challenging to guide neural network (NN) learning with prior knowledge. In contrast, many known properties, such as spatial smoothness or seasonality, are straightforward to model by choosing an appropriate kernel in a Gaussian process (GP). Many deep learning applications could be enhanced by modeling such known properties. For example, convolutional neural networks (CNNs) are frequently used in remote sensing, which is subject to strong seasonal effects. We propose to blend the expressive power of deep learning with the explicit modeling capabilities of GPs by using a composite kernel that combines a kernel implicitly defined by a neural network with a second kernel function chosen to model known properties (e.g., seasonality). We then approximate the resulting GP by combining a deep network with an efficient mapping based on the Nyström approximation, which we call the Implicit Composite Kernel (ICK). ICK is flexible and can be used to include prior information in neural networks for many applications. We demonstrate the strength of our framework through its superior performance and flexibility on both synthetic and real-world datasets. The code is available at: https://anonymous.4open.science/r/ICK_NNGP-17C5/.
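To make the construction concrete, below is a minimal sketch of one way to read the abstract: a neural network maps one input (e.g., an image) to an m-dimensional latent vector, a Nyström-based feature map does the same for the input governed by the chosen kernel (e.g., a timestamp under a periodic kernel for seasonality), and the prediction is the inner product of the two latents, which corresponds to a product composite kernel. All names (`NystromFeatures`, `ICK`, `periodic_kernel`), the periodic-kernel choice, and the product combination are illustrative assumptions rather than the authors' exact implementation; see the linked repository for their code.

```python
import torch
import torch.nn as nn

def periodic_kernel(a, b, period=12.0, lengthscale=1.0):
    # Exp-sine-squared kernel, a common choice for seasonality (assumed here).
    # a: (n, d), b: (m, d) -> (n, m)
    d = a[:, None, :] - b[None, :, :]
    s = torch.sin(torch.pi * d.norm(dim=-1) / period)
    return torch.exp(-2.0 * (s / lengthscale) ** 2)

class NystromFeatures(nn.Module):
    """Maps t to phi(t) such that phi(t) phi(t')^T approximates k(t, t'),
    using the Nystrom approximation with m inducing points."""
    def __init__(self, kernel_fn, inducing_points):
        super().__init__()
        self.kernel_fn = kernel_fn
        self.register_buffer("z", inducing_points)          # (m, d)
        K_mm = kernel_fn(inducing_points, inducing_points)  # (m, m)
        # K_mm^{-1/2} via eigendecomposition; jitter keeps it well-conditioned.
        w, V = torch.linalg.eigh(K_mm + 1e-6 * torch.eye(inducing_points.shape[0]))
        self.register_buffer(
            "K_inv_sqrt", V @ torch.diag(w.clamp_min(1e-12).rsqrt()) @ V.T
        )

    def forward(self, t):
        return self.kernel_fn(t, self.z) @ self.K_inv_sqrt  # (n, m)

class ICK(nn.Module):
    """Product-kernel variant: prediction = <f(x), g(t)>, so the implied
    kernel is the product of the NN kernel and the chosen kernel."""
    def __init__(self, nn_encoder, nystrom_map):
        super().__init__()
        self.f = nn_encoder   # any NN ending in an m-dimensional output
        self.g = nystrom_map

    def forward(self, x, t):
        return (self.f(x) * self.g(t)).sum(dim=-1)

# Hypothetical usage: 16 monthly inducing points, a small MLP encoder.
z = torch.linspace(0, 12, 16).unsqueeze(-1)
model = ICK(
    nn.Sequential(nn.Linear(8, 64), nn.ReLU(), nn.Linear(64, 16)),
    NystromFeatures(periodic_kernel, z),
)
y_hat = model(torch.randn(32, 8), torch.rand(32, 1) * 12)  # shape (32,)
```

One design note: because the Nyström features are fixed once the inducing points are chosen, the chosen kernel's structure (here, periodicity) is baked into the latent space, and only the NN encoder's parameters need to be learned by backpropagation.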