Keywords: Tabular Data, Deep Learning, Generative Modeling, Transformers, Masked Transformers, Synthetic Data
TL;DR: Masked Transformers generate high-quality tabular data, scale well, offer better privacy tradeoffs, and natively handle missing data and conditional generation
Abstract: Autoregressive and Masked Transformers are highly effective as generative models and classifiers.
While these models are most prevalent in NLP, they also exhibit strong performance in other domains, such as vision.
This work extends the exploration of transformer-based models to synthetic data generation for diverse application domains.
In this paper, we present TabMT, a novel Masked Transformer design for generating synthetic tabular data.
TabMT effectively addresses the unique challenges posed by heterogeneous data fields and is natively able to handle missing data.
Our design leverages improved masking techniques to enable generation and demonstrates state-of-the-art performance on tabular datasets ranging from extremely small to extremely large.
We evaluate TabMT for privacy-focused applications and find that it generates high-quality data with superior privacy tradeoffs.
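To make the masked-generation idea in the abstract concrete, below is a minimal sketch, assuming each field has its own token vocabulary and a hypothetical `model(tokens)` that returns per-field logits. The random unmasking order, the `MASK` sentinel, and the interface are illustrative assumptions, not TabMT's actual implementation.

```python
import numpy as np

MASK = -1  # sentinel id marking a not-yet-generated field (assumed convention)

def generate_row(model, n_fields, rng, conditioned=None):
    """Generate one row by unmasking fields one at a time in random order.

    `conditioned` is an optional {field_index: token_id} mapping of fixed
    values; conditional generation and missing-data imputation both reduce
    to starting with some fields already unmasked.
    """
    tokens = np.full(n_fields, MASK, dtype=np.int64)
    if conditioned:
        for i, v in conditioned.items():
            tokens[i] = v
    # Visit the still-masked fields in a random order.
    order = rng.permutation([i for i in range(n_fields) if tokens[i] == MASK])
    for i in order:
        logits = model(tokens)[i]               # logits over field i's vocabulary
        probs = np.exp(logits - logits.max())   # softmax
        probs /= probs.sum()
        tokens[i] = rng.choice(len(probs), p=probs)  # sample a value, unmask it
    return tokens

# Toy stand-in model: uniform logits over a 5-token vocabulary per field.
dummy_model = lambda toks: np.zeros((len(toks), 5))
rng = np.random.default_rng(0)
print(generate_row(dummy_model, n_fields=4, rng=rng, conditioned={0: 2}))
```

Fixing some fields before generation is how conditional generation and imputation fall out of the same procedure, matching the capabilities claimed in the TL;DR.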
Supplementary Material: zip
Submission Number: 5421