Sequential Density Ratio Estimation for Simultaneous Optimization of Speed and Accuracy

Published: 12 Jan 2021, Last Modified: 03 Apr 2024
ICLR 2021 Spotlight
Readers: Everyone
Keywords: Sequential probability ratio test, Early classification, Density ratio estimation
Abstract: Classifying sequential data as early and as accurately as possible is a challenging yet critical problem, especially when the sampling cost is high. One algorithm that achieves this goal is the sequential probability ratio test (SPRT), which is known to be Bayes-optimal: it keeps the expected number of data samples as small as possible given a desired upper bound on the error rate. However, the original SPRT makes two critical assumptions that limit its application in real-world scenarios: (i) samples are independently and identically distributed, and (ii) the likelihood of the data under each class can be calculated precisely. Here, we propose the SPRT-TANDEM, a deep neural network-based SPRT algorithm that overcomes these two obstacles. The SPRT-TANDEM sequentially estimates the log-likelihood ratio of two alternative hypotheses by leveraging a novel Loss function for Log-Likelihood Ratio estimation (LLLR), while allowing correlations with up to $N (\in \mathbb{N})$ preceding samples. In tests on one original and two public video databases (Nosaic MNIST, UCF101, and SiW), the SPRT-TANDEM achieves statistically significantly better classification accuracy than other baseline classifiers while using fewer data samples. The code and Nosaic MNIST are publicly available at https://github.com/TaikiMiyagawa/SPRT-TANDEM.
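For intuition about the decision rule the abstract builds on, here is a minimal sketch of the classical SPRT with an accumulated log-likelihood ratio, assuming a toy i.i.d. Gaussian setting. The function name, thresholds, and distributional assumptions are illustrative only; this is not the SPRT-TANDEM network or the LLLR (see the linked repository for the authors' code), which replaces the analytic ratios below with DNN-estimated ones.

```python
import numpy as np

# Toy sketch of the classical SPRT decision rule (illustrative only; the
# SPRT-TANDEM estimates the per-step log-likelihood ratios with a DNN
# trained via the LLLR instead of computing them analytically).
def sprt_gaussian(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.05, beta=0.05):
    """Return (decision, number_of_samples_used)."""
    lower = np.log(beta / (1.0 - alpha))   # accept H0 when the LLR falls below this
    upper = np.log((1.0 - beta) / alpha)   # accept H1 when the LLR rises above this
    llr = 0.0
    for t, x in enumerate(samples, start=1):
        # Per-sample log-likelihood ratio log p(x | H1) / p(x | H0) for Gaussians
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2.0 * sigma ** 2)
        if llr >= upper:
            return "H1", t
        if llr <= lower:
            return "H0", t
    return "undecided", len(samples)

# Example: data drawn from H1; the test typically stops after only a few samples.
rng = np.random.default_rng(0)
print(sprt_gaussian(rng.normal(loc=1.0, scale=1.0, size=100)))
```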
One-sentence Summary: With a novel sequential density ratio estimation algorithm, we relax critical assumptions of the classical Sequential Probability Ratio Test so that it becomes applicable to various real-world scenarios.
Supplementary Material: zip
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Code: [TaikiMiyagawa/SPRT-TANDEM](https://github.com/TaikiMiyagawa/SPRT-TANDEM) + [1 community implementation on Papers with Code](https://paperswithcode.com/paper/?openreview=Rhsu5qD36cL)
Data: [Moving MNIST](https://paperswithcode.com/dataset/moving-mnist), [SiW](https://paperswithcode.com/dataset/siw), [UCF101](https://paperswithcode.com/dataset/ucf101)