NoiseOut: Learning to Gate Improves Robustness in Deep Neural Networks

21 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: robust classifiers; bionic algorithms
Abstract: Deep Neural Networks (DNNs) achieve impressive performance when the training and test data come from similar distributions. However, they struggle to generalize to novel data, such as perturbed images, that differ from the training distribution. Using the Integrated Gradients method, we visualize several perturbed features that contribute to higher classification errors. To filter out such distractor features, we take inspiration from the thalamus, a biological gating mechanism that improves the signal fidelity of novel stimuli for task completion. We propose NoiseOut, a lightweight modular gating mechanism that can be easily integrated with existing DNNs to enhance their robustness to novel image perturbations. When training on clean datasets, we randomly replace a subset of the hidden states with normally-sampled values and incorporate the Integrated Gradients analysis as an additional objective function. Through these processes, NoiseOut gradually learns dynamic gating policies that filter out distractor signals and pass task-relevant information to the classifier. When evaluated on perturbed datasets, NoiseOut applies the previously learned gating policies to filter out features that negatively influence classification. We demonstrate that our modular NoiseOut mechanism improves the robustness of existing DNNs to novel perturbations, achieving strong results on the MNIST-C and ImageNet-C benchmarks.
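The abstract's training-time procedure (replacing a random subset of hidden states with normally-sampled values, then gating them before the classifier) can be sketched roughly as follows. The paper's actual architecture, gate parameterization, and objective are not given here, so the function name `noiseout_train_step`, the corruption probability `p`, and the use of a simple per-unit multiplicative gate are all illustrative assumptions, not the authors' implementation.

```python
import random

def noiseout_train_step(hidden, gate, p=0.2, mu=0.0, sigma=1.0):
    """Hypothetical sketch of NoiseOut's training-time corruption and gating.

    During training on clean data, each hidden unit is replaced with a
    normally-sampled value with probability ``p``. A gate (assumed here
    to be a per-unit weight in [0, 1], learned in the real method) then
    scales every unit, so the network can learn to suppress corrupted,
    distractor-like activations while passing task-relevant ones.
    """
    # Randomly replace a subset of hidden states with Gaussian noise.
    corrupted = [
        random.gauss(mu, sigma) if random.random() < p else h
        for h in hidden
    ]
    # Apply the gating policy to the (possibly corrupted) hidden states.
    return [g * c for g, c in zip(gate, corrupted)]
```

At test time on perturbed inputs, the same learned gate would be applied without the noise injection, which is what the abstract describes as reusing the prior gating policies.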
Primary Area: general machine learning (i.e., none of the above)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 3555