Kernel Functional Optimisation

21 May 2021, 20:50 (edited 15 Jan 2022) · NeurIPS 2021 Poster
  • Keywords: Non-parametric kernels, Kernel learning, Bayesian functional optimisation, Hyperkernels, Kernel machines, Gaussian Process, Hyperparameter tuning, Machine learning, Support Vector Machines, Reproducing Kernel Hilbert Spaces, Black-box optimisation, sample-efficient optimisation, Bayesian methods
  • TL;DR: We propose a novel approach for the optimisation of kernel functionals using efficient Bayesian functional optimisation.
  • Abstract: Traditional methods for kernel selection rely on parametric kernel functions, or combinations thereof, and although the kernel hyperparameters are tuned, these methods often yield sub-optimal results because of the limitations imposed by the parametric forms. In this paper, we propose a novel formulation for kernel selection using efficient Bayesian optimisation to find the best-fitting non-parametric kernel. The kernel is expressed as a linear combination of functions sampled from a prior Gaussian Process (GP) defined by a hyperkernel. We also provide a mechanism to ensure the positive definiteness of the Gram matrix constructed from the resultant kernels. Our experimental results on GP regression and Support Vector Machine (SVM) classification tasks involving both synthetic functions and several real-world datasets show the superiority of our approach over the state-of-the-art.
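To make the abstract's construction concrete, here is a minimal NumPy sketch of the idea: treat a candidate kernel k(x, x') as a function on input pairs, sample basis functions from a GP prior over pair space (using a squared-exponential kernel as a stand-in hyperkernel), combine them linearly, and then repair the resulting Gram matrix. Everything here, including the eigenvalue-clipping step for positive semi-definiteness, is an illustrative assumption, not the paper's actual hyperkernel or its positive-definiteness mechanism.

```python
import numpy as np

def rbf(a, b, ls=1.0):
    # Squared-exponential kernel, used here as a stand-in hyperkernel
    # over pair space (the paper's hyperkernel may differ).
    d = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * d / ls**2)

rng = np.random.default_rng(0)
n = 20
X = rng.uniform(-1, 1, size=(n, 1))                      # inputs
# All (x, x') pairs, so a kernel is a function on this pair space.
pairs = np.array([[xi[0], xj[0]] for xi in X for xj in X])

# Sample basis functions f_i(x, x') from the GP prior defined by the
# hyperkernel (jitter added for a stable Cholesky factorisation).
n_basis = 5
K_hyper = rbf(pairs, pairs, ls=0.5) + 1e-8 * np.eye(len(pairs))
L = np.linalg.cholesky(K_hyper)
F = L @ rng.standard_normal((len(pairs), n_basis))

# Linear combination of sampled functions; in the paper the weights
# would be chosen by Bayesian functional optimisation, here random.
alpha = rng.uniform(0, 1, size=n_basis)
G = (F @ alpha).reshape(n, n)
G = 0.5 * (G + G.T)                                      # symmetrise

# Illustrative PSD repair: clip negative eigenvalues of the Gram matrix.
w, V = np.linalg.eigh(G)
G_psd = V @ np.diag(np.clip(w, 0.0, None)) @ V.T
```

The resulting `G_psd` can be plugged into any kernel machine (e.g. a precomputed-kernel SVM) exactly as a Gram matrix from a parametric kernel would be.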
  • Supplementary Material: pdf
  • Code Of Conduct: I certify that all co-authors of this work have read and commit to adhering to the NeurIPS Statement on Ethics, Fairness, Inclusivity, and Code of Conduct.
  • Code: