Scalable Hybrid Hidden Markov Model with Gaussian Process Emission for Sequential Time-series Observations

Published: 21 Dec 2020, Last Modified: 05 May 2023. Venue: AABI 2020
Keywords: Hybrid Bayesian Hidden Markov Model, Gaussian Process, Spectral Mixture Kernel, Stochastic Variational Inference, Stochastic Gradient Variational Inference, Random Fourier Feature, Scalable Approximate Bayesian Inference
Abstract: A hidden Markov model (HMM) with a Gaussian process (GP) emission model has been widely used to model sequential data of complex form. This study introduces a hybrid Bayesian HMM with a GP emission using the spectral mixture (SM) kernel, which we call HMM-GPSM, for estimating the hidden state of each time-series observation sequentially observed from a single channel. We then propose a scalable learning method to train HMM-GPSM on large-scale data having (1) long sequences of state transitions and (2) a large number of time-series observations for each hidden state. For long sequences of state transitions, we employ stochastic variational inference (SVI) to efficiently update the parameters of HMM-GPSM. For a large number of data points in each time-series observation, we propose an approximate GP emission based on spectral points sampled from the spectral density of the SM kernel via random Fourier features (RFF), together with an efficient inference procedure for the kernel hyperparameters of the approximate GP emission and the corresponding HMM-GPSM. Specifically, we derive a training loss, i.e., an evidence lower bound of the HMM-GPSM, that can be computed scalably for a large number of time-series observations by employing a KL-divergence-regularized lower bound of the GP emission likelihood. The two proposed methods can be used together for sequential time-series datasets having both (1) and (2). We validate the proposed method on synthetic data, using clustering accuracy and training time as performance metrics.
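The RFF approximation mentioned in the abstract can be illustrated with a small sketch. The spectral density of an SM kernel is a Gaussian mixture, so spectral points can be drawn by first choosing a mixture component and then sampling a frequency from that component's Gaussian; random Fourier features built from those frequencies give a low-rank surrogate for the exact SM kernel matrix. This is a generic illustration of RFF for an SM kernel, not the paper's implementation; all mixture parameters and function names below are hypothetical choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_sm_spectral_points(weights, means, stds, num_features, rng):
    """Draw frequencies from the spectral density of a spectral mixture (SM)
    kernel, i.e. a Gaussian mixture with component means `means`, standard
    deviations `stds`, and mixing weights `weights` (hypothetical values)."""
    weights = np.asarray(weights, dtype=float)
    probs = weights / weights.sum()
    comp = rng.choice(len(probs), size=num_features, p=probs)
    return rng.normal(np.asarray(means)[comp], np.asarray(stds)[comp])

def rff_features(x, freqs, rng):
    """Random Fourier features phi(x) such that phi(x) @ phi(y) approximates
    the kernel value k(x, y) in expectation over the sampled frequencies."""
    b = rng.uniform(0.0, 2.0 * np.pi, size=freqs.shape[0])
    return np.sqrt(2.0 / freqs.shape[0]) * np.cos(
        2.0 * np.pi * np.outer(x, freqs) + b
    )

# Approximate the SM kernel Gram matrix on a toy 1-D time grid.
x = np.linspace(0.0, 1.0, 50)
freqs = sample_sm_spectral_points(
    weights=[0.6, 0.4], means=[2.0, 5.0], stds=[0.3, 0.5],
    num_features=500, rng=rng,
)
Phi = rff_features(x, freqs, rng)
K_approx = Phi @ Phi.T  # low-rank surrogate for the exact SM kernel matrix
```

Because the GP emission likelihood then involves only the feature matrix `Phi`, its cost scales with the number of features rather than cubically with the number of data points in each observation, which is what makes point (2) tractable.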