- Abstract: Detecting the emergence of abrupt property changes in time series is a challenging problem. The kernel two-sample test has been studied for this task because it makes fewer assumptions on the distributions than traditional parametric approaches. However, selecting the kernel is non-trivial in practice. Although kernel selection for the two-sample test has been studied, the scarcity of samples in the change-point detection (CPD) setting hinders the success of those kernel selection algorithms. In this paper, we propose KL-CPD, a novel kernel learning framework that optimizes a lower bound of test power via an auxiliary generative model. With a deep kernel parameterization, KL-CPD endows the kernel two-sample test with a data-driven kernel that detects different types of change-points in real-world applications. The proposed approach significantly outperforms state-of-the-art methods in our comparative evaluation on benchmark datasets.
- Keywords: deep kernel learning, kernel two-sample test, time series change-point detection
- TL;DR: We propose a novel kernel learning framework, KL-CPD, for time series CPD that optimizes a lower bound of test power via an auxiliary generative model serving as a surrogate for the abnormal distribution.
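To make the setting concrete, below is a minimal NumPy sketch of the kernel two-sample (MMD) score that underlies kernel CPD: each time step is scored by the unbiased squared MMD between the window before it and the window after it, and change-points correspond to high scores. This sketch uses a fixed RBF kernel with a hand-picked bandwidth, precisely the kernel-selection step that KL-CPD replaces with a learned deep kernel; the function names and sliding-window scheme are illustrative, not the paper's implementation.

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    # Pairwise RBF kernel values between rows of X and rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def mmd2_unbiased(X, Y, sigma=1.0):
    # Unbiased estimate of the squared maximum mean discrepancy
    # between the samples X and Y (diagonal terms excluded).
    m, n = len(X), len(Y)
    Kxx = rbf_kernel(X, X, sigma)
    Kyy = rbf_kernel(Y, Y, sigma)
    Kxy = rbf_kernel(X, Y, sigma)
    term_x = (Kxx.sum() - np.trace(Kxx)) / (m * (m - 1))
    term_y = (Kyy.sum() - np.trace(Kyy)) / (n * (n - 1))
    return term_x + term_y - 2 * Kxy.mean()

def cpd_scores(series, w=20, sigma=1.0):
    # Change-point score at each t: MMD^2 between the w samples
    # before t and the w samples after t. series has shape (T, d).
    scores = []
    for t in range(w, len(series) - w):
        scores.append(mmd2_unbiased(series[t - w:t], series[t:t + w], sigma))
    return np.array(scores)
```

With a clear mean shift, the score peaks near the true change-point; with subtle or structured changes, the fixed RBF bandwidth often fails, which is the small-sample kernel-selection problem the abstract refers to.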