Keywords: Deep Learning, Neural Networks, Dropout, Automatic Dropout
TL;DR: Data-driven extensions of Dropout that automatically learn drop rates
Abstract: Over time, Dropout has proven to be one of the most effective techniques for fighting overfitting while improving the overall performance of deep learning models. When training with Dropout, a randomly selected subset of activations is set to zero within each layer, according to a hyper-parameter called the drop rate. Finding a suitable drop rate can be very expensive, especially now that modern neural networks contain a large number of parameters. We introduce "DropAut", a completely data-driven extension of Dropout that enables the model to learn and adapt the drop rate to the task and data at hand. However, both Dropout and "DropAut" use a single drop rate for all the units of a layer, which can be sub-optimal since not all units are equally important. We therefore also propose two DropAut extensions, "UnitsDropAut" and "BottleneckDropAut", which additionally allow the model to learn a specific drop rate for each unit of a layer. We first derive a bound on the generalization performance of Dropout, "DropAut", "UnitsDropAut" and "BottleneckDropAut", and then evaluate the proposed approaches with different kinds of neural models on a range of datasets, showing consistent improvements over Dropout in all the experiments conducted. The code is available at https://github.com/<anonymous>.
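To make the core idea concrete, below is a minimal, hypothetical PyTorch sketch of a dropout layer with a learnable drop rate. The submission does not describe its exact parameterization or training procedure, so the sigmoid-parameterized logit and the concrete (relaxed Bernoulli) mask used to keep the rate differentiable are assumptions for illustration only; the class name `DropAutSketch` is likewise hypothetical.

```python
import torch
import torch.nn as nn


class DropAutSketch(nn.Module):
    """Hypothetical sketch of a dropout layer with a learnable drop rate.

    Assumptions (not from the submission): the drop rate is the sigmoid of a
    trainable logit, and a concrete (relaxed Bernoulli) mask keeps the rate
    differentiable so it can be trained jointly with the network weights.
    """

    def __init__(self, init_rate: float = 0.5, temperature: float = 0.1):
        super().__init__()
        # Trainable logit chosen so that sigmoid(logit) == init_rate at start.
        self.logit = nn.Parameter(torch.tensor(float(init_rate)).logit())
        self.temperature = temperature

    @property
    def drop_rate(self) -> torch.Tensor:
        return torch.sigmoid(self.logit)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if not self.training:
            return x  # no dropout at evaluation time
        p = self.drop_rate
        eps = 1e-7
        u = torch.rand_like(x).clamp(eps, 1 - eps)
        # Concrete relaxation of a Bernoulli "keep" mask (keep prob = 1 - p).
        logits = (torch.log(1 - p) - torch.log(p)
                  + torch.log(u) - torch.log(1 - u))
        keep = torch.sigmoid(logits / self.temperature)
        # Inverted-dropout scaling keeps the expected activation unchanged.
        return x * keep / (1 - p)
```

Under the same assumptions, a per-unit variant in the spirit of "UnitsDropAut" could replace the scalar logit with a vector whose length equals the number of units in the layer.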
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Submission Guidelines: Yes
Please Choose The Closest Area That Your Submission Falls Into: Deep Learning and representational learning