Stochastic Competition Networks for Deep Learning on Tabular Data

Published: 22 Sep 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: deep learning, tabular data, transformer, stochastic, lwta
Abstract: Despite the prevalence and significance of tabular data across numerous industries and fields, it has been relatively underexplored in the realm of deep learning. Even today, neural networks are often overshadowed by techniques such as gradient-boosted decision trees (GBDT). However, recent models are beginning to close this gap, outperforming GBDT in various setups and garnering increased attention in the field. Drawing inspiration from this trend, in this work we introduce a novel deep learning model specifically designed for tabular data. The foundation of this model is a Transformer-based architecture, carefully adapted to the unique properties of tabular data through strategic architectural modifications, chiefly two forms of stochastic competition. First, we employ the "Local Winner Takes All" (LWTA) mechanism as a refined alternative to ReLU-activated layers. Second, we introduce a novel embedding layer that blends multiple linear embedding layers through a form of stochastic competition. Model effectiveness is validated on a variety of widely-used, publicly available datasets. We show that, by incorporating these stochastic elements, the model achieves state-of-the-art performance and marks a significant advancement in applying deep learning to tabular data.
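To make the first mechanism concrete: a stochastic LWTA layer partitions the units of a hidden layer into small blocks; within each block, a single "winner" is sampled with probability given by a softmax over the block's activations, and the losing units are zeroed out. The sketch below is an illustrative, minimal NumPy version of this general idea, not the authors' implementation; the function name, block size, and sampling details are assumptions for illustration only.

```python
import numpy as np

def stochastic_lwta(x, block_size=2, rng=None):
    """Illustrative stochastic Local-Winner-Takes-All activation.

    Splits the last dimension of ``x`` into blocks of ``block_size``.
    Within each block, one winner is sampled with softmax probability,
    keeps its value, and all losers are set to zero.  Hypothetical
    sketch; the actual paper's layer may differ in details.
    """
    rng = np.random.default_rng(rng)
    x = np.asarray(x, dtype=float)
    n = x.shape[-1]
    assert n % block_size == 0, "feature dim must be divisible by block_size"
    blocks = x.reshape(*x.shape[:-1], n // block_size, block_size)

    # Competition probabilities: softmax within each block (numerically stable).
    z = blocks - blocks.max(axis=-1, keepdims=True)
    p = np.exp(z) / np.exp(z).sum(axis=-1, keepdims=True)

    # Sample one winner index per block via inverse-CDF sampling.
    cum = p.cumsum(axis=-1)
    u = rng.random(size=cum.shape[:-1] + (1,))
    winner = (u < cum).argmax(axis=-1)

    # One-hot mask keeps only the winner's activation in each block.
    mask = np.eye(block_size)[winner]
    return (blocks * mask).reshape(x.shape)

out = stochastic_lwta([1.0, -2.0, 0.5, 3.0], block_size=2, rng=0)
```

Unlike ReLU, which zeroes units independently by sign, this activation is sparse by construction (exactly one active unit per block) and injects sampling noise at every forward pass, which is the stochastic element the abstract highlights.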
Supplementary Material: zip
Primary Area: probabilistic methods (Bayesian methods, variational inference, sampling, UQ, etc.)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5805