ContextFlow++: Generalist-Specialist Flow-based Generative Models with Mixed-variable Context Encoding
Keywords: normalizing flows, contexts, discrete, conditioning, anomaly detection, failure prediction
TL;DR: This paper introduces a method to condition normalizing flows on mixed-variable contexts that explicitly decouples generalist and context-specific knowledge
Abstract: Normalizing flow-based generative models have been widely used in applications where exact density estimation is of major importance. Recent research proposes numerous methods to improve their expressivity.
However, conditioning on a context is a largely overlooked area in bijective flow research. Conventional conditioning via vector concatenation is limited to only a few flow types.
More importantly, this approach cannot support a practical setup where a set of context-conditioned (*specialist*) models is trained on top of a fixed, pretrained general-knowledge (*generalist*) model. We propose the ContextFlow++ approach to overcome these limitations using additive conditioning with explicit generalist-specialist knowledge decoupling. Furthermore, we support discrete contexts via the proposed mixed-variable architecture with context encoders. In particular, our context encoder for discrete variables is a surjective flow from which the context-conditioned continuous variables are sampled. Our experiments on the rotated MNIST-R, corrupted CIFAR-10C, real-world ATM predictive maintenance, and SMAP unsupervised anomaly detection benchmarks show that the proposed ContextFlow++ offers faster, more stable training and achieves higher performance metrics. Our code is publicly available at [github.com/gudovskiy/contextflow](https://github.com/gudovskiy/contextflow).
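To make the additive-conditioning idea in the abstract concrete, below is a minimal numpy sketch, assuming an affine coupling layer: a frozen "generalist" predicts the scale/shift parameters, and a trainable per-context "specialist" embedding is added to those parameters. All names, shapes, and the zero-initialized embedding table are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

# Hypothetical sketch: additive context conditioning in an affine
# coupling layer. The generalist weights are frozen; each discrete
# context gets a trainable additive correction (the specialist).
rng = np.random.default_rng(0)
D = 4  # total dims, split into two halves of D // 2

# Frozen generalist parameters (pretrained in the generalist phase)
W_gen = rng.normal(scale=0.1, size=(D // 2, D))  # maps x1 -> [log_s, t]

# Trainable specialist embeddings, one per discrete context id;
# starting at zero, each specialist initially reproduces the generalist.
n_contexts = 3
E_spec = np.zeros((n_contexts, D))

def forward(x, ctx):
    """Map x -> z under context `ctx`; return z and log|det J|."""
    x1, x2 = x[: D // 2], x[D // 2:]
    params = x1 @ W_gen + E_spec[ctx]   # additive conditioning
    log_s, t = params[: D // 2], params[D // 2:]
    z2 = x2 * np.exp(log_s) + t         # affine transform of x2
    return np.concatenate([x1, z2]), log_s.sum()

def inverse(z, ctx):
    """Exact inverse of `forward` for the same context."""
    z1, z2 = z[: D // 2], z[D // 2:]
    params = z1 @ W_gen + E_spec[ctx]
    log_s, t = params[: D // 2], params[D // 2:]
    x2 = (z2 - t) * np.exp(-log_s)
    return np.concatenate([z1, x2])

x = rng.normal(size=D)
z, log_det = forward(x, ctx=1)
x_rec = inverse(z, ctx=1)
print(np.allclose(x, x_rec))  # → True: bijectivity holds per context
```

Because the context enters additively in parameter space rather than by concatenating it to the input, the layer stays invertible for any context id, and only `E_spec` needs gradients when training specialists against the frozen generalist.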
List Of Authors: Gudovskiy, Denis and Okuno, Tomoyuki and Nakata, Yohei
Latex Source Code: zip
Signed License Agreement: pdf
Code Url: https://github.com/gudovskiy/contextflow
Submission Number: 100