Modern Hopfield Networks as Memory for Iterative Learning on Tabular Data

Published: 27 Oct 2023, Last Modified: 26 Nov 2023 · AMHN23 Poster
Keywords: deep learning, tabular data, continuous modern Hopfield networks, associative memory
TL;DR: We propose a novel deep learning architecture based on continuous modern Hopfield networks for tackling small tabular datasets.
Abstract: While Deep Learning excels on structured data as encountered in vision and natural language processing, it has failed to meet expectations on tabular data. For tabular data, Support Vector Machines (SVMs), Random Forests, and Gradient Boosting are the best-performing techniques. We suggest "Hopular", a novel Deep Learning architecture for medium- and small-sized datasets, where each layer is equipped with continuous modern Hopfield networks. Hopular's novelty is that every layer can directly access both the original input and the whole training set via data stored in the Hopfield networks. Therefore, Hopular can step-wise update its current model and the resulting prediction at every layer, like standard iterative learning algorithms. In experiments on small-sized tabular datasets with fewer than 1,000 samples, Hopular surpasses Gradient Boosting, Random Forests, SVMs, and in particular several Deep Learning methods. In experiments on medium-sized tabular data with about 10,000 samples, Hopular outperforms XGBoost, CatBoost, LightGBM, and a state-of-the-art Deep Learning method designed for tabular data. Thus, Hopular is a strong alternative to these methods on tabular data.
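The retrieval mechanism that lets each layer query stored data can be illustrated with the continuous modern Hopfield update rule (Ramsauer et al.), which is an attention-style softmax readout over stored patterns. The sketch below is a minimal NumPy illustration of that update step, not Hopular's actual implementation; the function name and the inverse-temperature parameter `beta` are illustrative choices.

```python
import numpy as np

def hopfield_retrieve(stored, query, beta=1.0):
    """One continuous modern Hopfield update step.

    stored: (N, d) array of stored patterns (e.g. training samples)
    query:  (d,) state pattern to be refined
    beta:   inverse temperature; larger beta sharpens retrieval

    Returns the updated state: stored^T softmax(beta * stored @ query).
    """
    scores = beta * stored @ query          # similarity to each stored pattern
    scores -= scores.max()                  # numerical stability for softmax
    weights = np.exp(scores)
    weights /= weights.sum()                # softmax over stored patterns
    return stored.T @ weights               # convex combination of patterns

# With a sharp beta, a query near a stored pattern retrieves that pattern.
patterns = np.array([[1.0, 0.0], [0.0, 1.0]])
out = hopfield_retrieve(patterns, np.array([0.9, 0.1]), beta=20.0)
```

With large `beta` the softmax concentrates on the closest stored pattern (one-step retrieval); with small `beta` the output is a smooth mixture, which is the regime a trainable layer can exploit.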
Submission Number: 2