TabContrast: A Local-Global Level Method for Tabular Contrastive Learning

Published: 28 Oct 2023, Last Modified: 15 Nov 2023 · TRL @ NeurIPS 2023 Poster
Keywords: Tabular Learning, Contrastive Learning
TL;DR: This paper proposes a method that uses interactions between samples to construct positive and negative pairs for tabular contrastive learning.
Abstract: Representation learning is a cornerstone of contemporary artificial intelligence, significantly boosting performance across diverse downstream tasks. Notably, domains such as computer vision and NLP have witnessed transformative advances owing to self-supervised contrastive learning. Yet translating these techniques to tabular data remains an intricate challenge. Traditional approaches in the tabular domain tend to focus on model architecture and loss-function design, often overlooking the nuanced construction of positive and negative sample pairs. These pairs are vital: they shape the quality of the learned representations and the overall efficacy of the model. Recognizing this, our paper examines the specific characteristics of tabular data and the unique challenges they present. As a solution, we introduce "TabContrast". The method adopts a local-global contrast approach: it segments features into subsets and then performs tailored clustering on each subset to uncover inherent data patterns. By aligning samples with cluster centroids and emphasizing clear semantic distinctions, TabContrast yields more effective representations. Preliminary evaluations highlight its potential, particularly on tabular datasets with larger numbers of features.
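The local-global idea in the abstract (split features into subsets, cluster each subset, and treat samples that land in the same clusters as positives) can be sketched as follows. This is a hypothetical reconstruction, not the authors' implementation: the function names, the simple k-means routine, and the "agree in every subset" positive-pair rule are all assumptions for illustration.

```python
import numpy as np

def kmeans(X, k, iters=10, seed=0):
    """Tiny k-means (stand-in for any off-the-shelf clusterer)."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    for _ in range(iters):
        # Assign each sample to its nearest centroid.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each non-empty centroid to the mean of its members.
        for c in range(k):
            if np.any(labels == c):
                centroids[c] = X[labels == c].mean(axis=0)
    return labels

def subset_cluster_labels(X, n_subsets=2, n_clusters=2, seed=0):
    """Split columns into random subsets and cluster each subset separately
    (the 'local' views of the data)."""
    rng = np.random.default_rng(seed)
    col_subsets = np.array_split(rng.permutation(X.shape[1]), n_subsets)
    per_subset = [kmeans(X[:, cols], n_clusters, seed=seed) for cols in col_subsets]
    return np.stack(per_subset, axis=1)  # shape: (n_samples, n_subsets)

def positive_pairs(labels):
    """Illustrative pairing rule (an assumption): two samples form a
    positive pair if they share a cluster in every feature subset."""
    n = labels.shape[0]
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if np.all(labels[i] == labels[j])]

# Toy data: two well-separated groups of five identical rows each.
X = np.vstack([np.zeros((5, 4)), np.full((5, 4), 10.0)])
labels = subset_cluster_labels(X, n_subsets=2, n_clusters=2)
pairs = positive_pairs(labels)
```

On this toy data every positive pair stays within one group, so the pairs feed directly into a standard contrastive loss (e.g. InfoNCE) as positives, with all remaining pairs as negatives.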
Submission Number: 51