AGATa: Attention-Guided Augmentation for Tabular Data in Contrastive Learning

Published: 10 Oct 2024, Last Modified: 31 Oct 2024, TRL @ NeurIPS 2024 Poster, CC BY 4.0
Keywords: Tabular domain, Contrastive learning, Input Augmentation
TL;DR: Attention-guided tabular data augmentation for improved contrastive learning and model performance.
Abstract: Contrastive learning has demonstrated significant potential across various domains, including recent applications to tabular data. However, adapting this approach to tabular structures presents distinct challenges, particularly in developing effective augmentation techniques. While existing methods have shown promise, there remains room for improvement in preserving critical feature relationships during the augmentation process. In this paper, we explore an alternative approach that utilizes attention scores to guide augmentation, aiming to introduce meaningful variations while maintaining important feature interactions. This method builds upon existing work in the field, offering a complementary perspective on tabular data augmentation for contrastive learning. Our approach explores two main aspects: 1) Attention-guided Feature Selection, which focuses augmentations on features with lower attention scores, and 2) Dynamic Augmentation Strategy, which alternates between different augmentation techniques during training. This combination aims to maintain key data characteristics while introducing diverse variations. Experimental results suggest that our method performs competitively with existing augmentation techniques in preserving tabular data structure and enhancing downstream task performance.
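The first aspect described in the abstract, Attention-guided Feature Selection, can be illustrated with a minimal sketch: features receiving the lowest attention scores are assumed to be the least critical, so perturbing them should introduce variation while preserving important feature interactions. The function name, the `frac` and `noise_std` parameters, and the use of Gaussian noise as the perturbation are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def attention_guided_augment(x, attn_scores, frac=0.3, noise_std=0.1, rng=None):
    """Perturb only the features with the LOWEST attention scores.

    x           : (n_samples, n_features) batch of tabular data
    attn_scores : (n_features,) attention score per feature
    frac        : fraction of features to perturb (hypothetical default)
    noise_std   : std of the Gaussian noise added to selected features
    """
    rng = rng or np.random.default_rng()
    n_features = x.shape[1]
    k = max(1, int(frac * n_features))
    # Low-attention features are assumed least important, so augmenting
    # them leaves the key feature relationships intact.
    low_attn_idx = np.argsort(attn_scores)[:k]
    x_aug = x.copy()
    x_aug[:, low_attn_idx] += rng.normal(0.0, noise_std, size=(x.shape[0], k))
    return x_aug, low_attn_idx
```

The second aspect, the Dynamic Augmentation Strategy, would then alternate this perturbation with other augmentation techniques (e.g. feature shuffling) across training steps; the choice of alternation schedule is not specified in the abstract.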
Submission Number: 74