Optimizing Embedding Space with Sub-categorical Supervised Pre-training: A Theoretical Approach Towards Improving Sepsis Prediction

Published: 01 Jan 2023 · Last Modified: 05 Feb 2025 · ICHI 2023 · CC BY-SA 4.0
Abstract: Supervised contrastive learning outperforms self-supervised learning on classification tasks by exploiting label information. However, it suffers from a collapsing embedding space, since positive samples are drawn at random from the same labeled group and pulled together. In this work, we theoretically guarantee that any pre-training method that maintains a mixture of sub-class distributions consistently outperforms supervised contrastive pre-training. Building on this analysis, we propose a new pre-training method that adopts an efficient Expectation-Maximization (EM) learning strategy. Finally, we empirically evaluate the proposed method on sepsis prediction using the PhysioNet/Computing in Cardiology Challenge dataset and show that it outperforms the state of the art from multiple perspectives.
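To make the failure mode concrete, below is a minimal NumPy sketch of the standard supervised contrastive objective (SupCon-style) that the abstract critiques: every other sample with the same label is treated as a positive and pulled toward the anchor, which is exactly the mechanism that can collapse sub-class structure within a label. This is an illustrative sketch, not the paper's proposed method; the function name and temperature value are assumptions.

```python
import numpy as np

def supcon_loss(embeddings, labels, temperature=0.1):
    """Illustrative supervised contrastive loss: all same-label samples
    are positives and are pulled together, regardless of sub-class."""
    # L2-normalize embeddings onto the unit sphere
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / temperature                # pairwise scaled similarities
    n = len(labels)
    mask_self = np.eye(n, dtype=bool)
    # log-softmax over all other samples (exclude self-similarity)
    logits = np.where(mask_self, -np.inf, sim)
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # positives: same label, excluding the anchor itself
    pos = (labels[:, None] == labels[None, :]) & ~mask_self
    # average log-probability of positives per anchor, then negate
    per_anchor = -np.where(pos, log_prob, 0.0).sum(axis=1) / np.maximum(
        pos.sum(axis=1), 1)
    return per_anchor.mean()
```

Because the positive set is the whole labeled group, minimizing this loss drives all embeddings of a class toward a single point; a class that is really a mixture of sub-classes (e.g. distinct sepsis phenotypes) loses that structure, which is the collapse the paper's sub-categorical pre-training is designed to avoid.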
