Abstract: Class imbalance, in which datasets lack sufficient samples for minority classes, is a persistent challenge in machine learning. Existing solutions often generate synthetic data to mitigate this issue, but they typically struggle with complex data distributions, primarily because they focus on oversampling the minority class while neglecting its relationship with the majority class. To overcome these limitations, we propose the Contrastive Tabular Variational Autoencoder (CTVAE), which integrates conditional Variational Autoencoders with contrastive learning techniques. CTVAE generates high-quality synthetic samples that capture the intricate data distributions of both minority and majority classes. Additionally, it can be seamlessly integrated with variants of the Synthetic Minority Oversampling Technique (SMOTE) for further gains. Experimental results demonstrate that CTVAE substantially improves classification performance on imbalanced datasets, offering a more robust and holistic solution to the class imbalance problem.
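To make the high-level description above concrete, the sketch below shows one way a conditional VAE for tabular data could be combined with a supervised contrastive term on the latent space, which is the general combination the abstract names. This is a minimal illustration under assumptions, not the paper's actual CTVAE implementation: the class names, layer sizes, loss weights (`beta`, `gamma`), and the use of a SupCon-style latent loss are all placeholders chosen for clarity.

```python
# Minimal sketch (assumed, not the authors' code): a conditional tabular VAE
# whose latent codes are additionally shaped by a supervised contrastive loss,
# so samples of the minority and majority classes are modeled jointly rather
# than oversampling the minority class in isolation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ConditionalTabularVAE(nn.Module):
    def __init__(self, n_features: int, n_classes: int, latent_dim: int = 16):
        super().__init__()
        # Encoder conditions on the class label via one-hot concatenation.
        self.encoder = nn.Sequential(
            nn.Linear(n_features + n_classes, 64), nn.ReLU(),
            nn.Linear(64, 2 * latent_dim),  # outputs [mu, log_var]
        )
        # Decoder also conditions on the label, so we can sample per class.
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim + n_classes, 64), nn.ReLU(),
            nn.Linear(64, n_features),
        )
        self.n_classes = n_classes

    def forward(self, x, y):
        y_onehot = F.one_hot(y, self.n_classes).float()
        mu, log_var = self.encoder(torch.cat([x, y_onehot], dim=1)).chunk(2, dim=1)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * log_var)  # reparameterization
        x_hat = self.decoder(torch.cat([z, y_onehot], dim=1))
        return x_hat, mu, log_var, z


def supervised_contrastive(z, y, temperature: float = 0.5):
    """Pull same-class latent codes together, push different classes apart."""
    z = F.normalize(z, dim=1)
    sim = z @ z.T / temperature
    pos_mask = (y.unsqueeze(0) == y.unsqueeze(1)).float()
    pos_mask.fill_diagonal_(0)  # exclude self-pairs from positives
    logits_mask = 1.0 - torch.eye(sim.size(0), device=sim.device)
    log_prob = sim - torch.log((torch.exp(sim) * logits_mask).sum(1, keepdim=True) + 1e-12)
    pos_per_anchor = pos_mask.sum(1).clamp(min=1)
    return -(pos_mask * log_prob).sum(1).div(pos_per_anchor).mean()


def training_step(model, x, y, beta: float = 1.0, gamma: float = 0.1):
    # Standard conditional-VAE objective plus a contrastive latent term.
    x_hat, mu, log_var, z = model(x, y)
    recon = F.mse_loss(x_hat, x)
    kl = -0.5 * torch.mean(1 + log_var - mu.pow(2) - log_var.exp())
    return recon + beta * kl + gamma * supervised_contrastive(z, y)
```

Once such a model is trained, minority-class samples can be synthesized by drawing z from the standard normal prior and decoding it together with the minority label; the resulting synthetic rows could then be mixed with SMOTE-style interpolation if desired, consistent with the abstract's claim that the method composes with SMOTE variants.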