Nested Expectations with Kernel Quadrature

Published: 01 May 2025, Last Modified: 18 Jun 2025 · ICML 2025 poster · CC BY 4.0
TL;DR: This paper proposes a new method, based on kernel quadrature, for estimating nested expectations.
Abstract: This paper considers the challenging computational task of estimating nested expectations. Existing algorithms, such as nested Monte Carlo or multilevel Monte Carlo, are known to be consistent but require a large number of samples at both the inner and outer levels to converge. Instead, we propose a novel estimator consisting of nested kernel quadrature estimators, and we prove that it has a faster convergence rate than all baseline methods when the integrands are sufficiently smooth. We then demonstrate empirically that our proposed method does indeed require the fewest samples to estimate nested expectations across a range of real-world application areas, from Bayesian optimisation to option pricing and health economics.
Lay Summary: This paper addresses the problem of estimating nested expectations, that is, expectations whose integrand itself contains an expectation (a double integral). We propose a novel estimator that achieves improved sample efficiency and lower computational cost. This problem is particularly important in scientific applications such as decision-making and risk management in finance.
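To make the problem setup concrete, the following sketch implements the plain nested Monte Carlo baseline that the paper compares against: estimating E_X[ f( E_{Y|X}[ g(X, Y) ] ) ] by drawing outer samples of X and, for each, inner samples of Y. This is an illustration of the baseline only, not the paper's kernel quadrature estimator; all function names and the toy example below are our own.

```python
import numpy as np

def nested_monte_carlo(f, g, sample_outer, sample_inner, n_outer, n_inner, seed=None):
    """Plain nested Monte Carlo estimate of E_X[ f( E_{Y|X}[ g(X, Y) ] ) ]."""
    rng = np.random.default_rng(seed)
    xs = sample_outer(n_outer, rng)            # outer samples X_1, ..., X_N
    total = 0.0
    for x in xs:
        ys = sample_inner(x, n_inner, rng)     # inner samples Y_1, ..., Y_M given X = x
        inner_mean = np.mean(g(x, ys))         # inner expectation estimate
        total += f(inner_mean)                 # apply outer integrand to the inner estimate
    return total / n_outer

# Toy example: E_X[ (E_{Y|X}[X + Y])^2 ] with X ~ N(0,1), Y ~ N(0,1) independent.
# The true value is E[X^2] = 1, since E[Y] = 0.
est = nested_monte_carlo(
    f=lambda m: m ** 2,
    g=lambda x, ys: x + ys,
    sample_outer=lambda n, rng: rng.standard_normal(n),
    sample_inner=lambda x, n, rng: rng.standard_normal(n),
    n_outer=2000,
    n_inner=200,
    seed=0,
)
```

Note that the inner-level error propagates through the (nonlinear) outer integrand f, biasing the estimate; this is why nested Monte Carlo needs many samples at both levels, and it is exactly the cost the paper's kernel quadrature estimator reduces when the integrands are smooth.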
Link To Code: https://github.com/hudsonchen/nest_kq
Primary Area: General Machine Learning->Kernel methods
Keywords: Kernel Quadrature, Monte Carlo, Kernel Ridge Regression
Submission Number: 2829