Bayesian Imbalanced Regression Debiasing

29 Sept 2021 (modified: 13 Feb 2023) · ICLR 2022 Conference Withdrawn Submission
Keywords: Imbalanced Regression, Bayesian Debiasing
Abstract: Imbalanced regression, where the training data is unevenly distributed over the label range, is widely encountered in the real world, e.g., age estimation (uni-dimensional regression) and pose estimation (multi-dimensional regression). Compared to imbalanced and long-tailed classification, imbalanced regression poses unique challenges, as the regression label space can be continuous, boundless, and high-dimensional. In this work, we present a principled framework, Bayesian Posterior Debiasing (Bayesian-PD), for rebalancing regression between frequent and rare observations. Our key insight is that a balanced posterior can be obtained by debiasing the conditional probability with a prior over the regression label space. Importantly, through a normalization reparameterization technique, we derive a general debiasing function between the empirical posterior and the balanced posterior without relying on task-specific assumptions. We show that the Bayesian-PD framework admits multiple instantiations at both training and test time, with either closed-form or numerical implementations. We further show that several existing methods in imbalanced classification/regression are special cases of our Bayesian-PD framework. Extensive experiments on both uni- and multi-dimensional regression benchmarks demonstrate the effectiveness of the Bayesian-PD framework on various real-world tasks. Notably, Bayesian-PD exhibits strong robustness to varying skewness of the training distribution.
One-sentence Summary: We present a principled Bayesian framework for debiasing imbalanced regression between frequent and rare observations.
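The core idea stated in the abstract, debiasing the empirical posterior with a label-space prior, can be sketched numerically via Bayes' rule: since p_emp(y|x) ∝ p(x|y)·p_emp(y), a balanced posterior follows as p_bal(y|x) ∝ p_emp(y|x)·p_bal(y)/p_emp(y). The sketch below is a minimal illustration under the assumption of a discretized label space with toy numbers; the function name and values are hypothetical and do not reproduce the paper's actual instantiations.

```python
import numpy as np

def debias_posterior(p_emp_post, p_emp_prior, p_bal_prior):
    """Reweight an empirical posterior over a discretized label space.

    Applies the Bayes-rule identity
    p_bal(y|x) ∝ p_emp(y|x) * p_bal(y) / p_emp(y), then renormalizes.
    """
    w = p_bal_prior / p_emp_prior   # prior importance weights
    p = p_emp_post * w              # unnormalized balanced posterior
    return p / p.sum()              # renormalize to a distribution

# Toy discretized label space with a skewed (imbalanced) training prior.
p_emp_prior = np.array([0.7, 0.2, 0.1])        # frequent vs. rare labels
p_bal_prior = np.full(3, 1.0 / 3.0)            # uniform target prior
p_emp_post = np.array([0.6, 0.3, 0.1])         # model posterior for some input x

p_bal_post = debias_posterior(p_emp_post, p_emp_prior, p_bal_prior)
print(p_bal_post)  # probability mass shifts toward the rare labels
```

Under these toy numbers, the debiased posterior assigns roughly three times more mass to the rarest label than the empirical posterior did, illustrating how the prior ratio rebalances frequent and rare observations.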