Incorporating Prior Knowledge into Neural Networks through an Implicit Composite Kernel

Published: 04 Mar 2024, Last Modified: 04 Mar 2024
Accepted by: TMLR
Abstract: It is challenging to guide neural network (NN) learning with prior knowledge. In contrast, many known properties, such as spatial smoothness or seasonality, are straightforward to model by choosing an appropriate kernel in a Gaussian process (GP). Many deep learning applications could be enhanced by modeling such known properties. For example, convolutional neural networks (CNNs) are frequently used in remote sensing, which is subject to strong seasonal effects. We propose to blend the strengths of NNs with the clear modeling capabilities of GPs by using a composite kernel that combines a kernel implicitly defined by a neural network with a second kernel function chosen to model known properties (e.g., seasonality). We implement this idea by combining a deep network with an efficient mapping function based on either the Nyström approximation or random Fourier features, which we call the Implicit Composite Kernel (ICK). We then adopt a sample-then-optimize approach to approximate the full GP posterior distribution. We demonstrate that ICK has superior performance and flexibility on both synthetic and real-world datasets, including a remote sensing dataset. The ICK framework can be used to incorporate prior information into neural networks in many applications.
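To make the construction concrete, below is a minimal PyTorch sketch of the idea described in the abstract: one latent map comes from a neural network (the implicit kernel) and one from random Fourier features approximating an RBF kernel, with the two combined by an inner product, which corresponds to a product of the two kernels. The class names (`RFFMap`, `ICKSketch`), the dimensions, the RBF kernel choice, and the inner-product combination are illustrative assumptions for this sketch, not the authors' exact implementation; see the linked repository for the real code.

```python
import math
import torch
import torch.nn as nn

class RFFMap(nn.Module):
    """Random Fourier feature map approximating an RBF kernel
    k(t, t') = exp(-||t - t'||^2 / (2 * lengthscale^2))."""
    def __init__(self, in_dim: int, n_features: int, lengthscale: float = 1.0):
        super().__init__()
        # Frozen random projection: W ~ N(0, 1/lengthscale^2), b ~ U(0, 2*pi)
        self.register_buffer("W", torch.randn(in_dim, n_features) / lengthscale)
        self.register_buffer("b", 2 * math.pi * torch.rand(n_features))

    def forward(self, t: torch.Tensor) -> torch.Tensor:
        # phi(t) @ phi(t') approximates k(t, t') in expectation
        return math.sqrt(2.0 / self.W.shape[1]) * torch.cos(t @ self.W + self.b)

class ICKSketch(nn.Module):
    """Composite kernel sketch: the inner product of an NN feature map
    (implicit kernel) and an RFF map (explicit kernel), i.e. a kernel product."""
    def __init__(self, x_dim: int, t_dim: int, latent_dim: int = 64):
        super().__init__()
        self.nn_map = nn.Sequential(
            nn.Linear(x_dim, 128), nn.ReLU(), nn.Linear(128, latent_dim)
        )
        self.rff_map = RFFMap(t_dim, latent_dim, lengthscale=0.5)

    def forward(self, x: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        # Elementwise product + sum = inner product of the two latent maps
        return (self.nn_map(x) * self.rff_map(t)).sum(dim=-1)

# Usage: x carries the features handled by the NN (e.g. image embeddings),
# t carries the property modeled by the explicit kernel (e.g. time of year).
model = ICKSketch(x_dim=16, t_dim=1)
x, t = torch.randn(8, 16), torch.rand(8, 1)
y_hat = model(x, t)  # shape (8,)
```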
Submission Length: Long submission (more than 12 pages of main content)
Code: https://github.com/jzy95310/ICK
Supplementary Material: pdf
Assigned Action Editor: ~Lechao_Xiao2
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Number: 1737