A Random Matrix Analysis of Learning with α-Dropout

Published: 06 Jul 2020, Last Modified: 05 May 2023, ICML Artemiss 2020
Abstract: This article studies a single-hidden-layer neural network with generalized Dropout (α-Dropout), in which the dropped-out features are replaced with an arbitrary value α rather than 0. Specifically, in a large-dimensional data and network regime, we characterize the generalization performance of this network on a binary classification problem. We notably demonstrate that a careful choice of α different from 0 can drastically improve the generalization performance of the classifier.
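As a rough illustration of the mechanism described in the abstract, the NumPy sketch below implements one possible reading of α-Dropout: each feature is independently dropped with some probability and, when dropped, is set to α instead of 0 (so α = 0 recovers standard Dropout). The function name, parameter names, and the absence of any rescaling of kept features are assumptions made for illustration, not the paper's exact definition.

```python
import numpy as np

def alpha_dropout(X, drop_prob=0.5, alpha=0.0, rng=None):
    """Sketch of generalized (alpha-)Dropout: entries dropped with
    probability `drop_prob` are replaced by `alpha` instead of 0.
    With alpha=0 this reduces to standard (unscaled) Dropout.
    Names and conventions here are illustrative assumptions."""
    rng = np.random.default_rng() if rng is None else rng
    keep = rng.random(X.shape) >= drop_prob  # True where the feature is kept
    return np.where(keep, X, alpha)

# Example: apply alpha-Dropout to a small batch of features
X = np.random.randn(4, 8)
X_dropped = alpha_dropout(X, drop_prob=0.3, alpha=0.1)
```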
Keywords: random matrix theory, dropout, zero imputation