Sample-aware RandAugment

15 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: representation learning for computer vision, audio, language, and other modalities
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Data augmentation, automated machine learning, computer vision, image recognition
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: We propose a search-free, sample-aware automatic data augmentation method that shrinks the gap between search-free and search-based methods with only about 0.1× extra training time.
Abstract: Automatic data augmentation (AutoDA) improves the generalization of neural networks by filling in missing data in the target distribution. However, mainstream AutoDA methods suffer either from a time-consuming search process that hinders broad application, or from limited performance due to the lack of dynamic policy adjustment during training. We propose an asymmetric, search-free augmentation strategy, Sample-aware RandAugment (SRA), that dynamically adjusts the augmentation policy while maintaining a simple implementation. SRA introduces a heuristic score-based module that evaluates the difficulty of the original training data on the fly, guiding an appropriate augmentation for each sample independently. SRA consists of three steps: 1) distribution exploration, 2) sample perception, and 3) distribution refinement. Across a variety of settings, SRA significantly shrinks the gap between search-based and search-free AutoDA methods. The proposed method achieves 78.31% ResNet-50 Top-1 accuracy on ImageNet, the state of the art among search-free methods. SRA can lead to simpler, more effective, and more practical AutoDA designs for diverse applications in the future.
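The abstract does not specify how the difficulty score maps to a policy, so below is a minimal, hypothetical Python sketch of the general idea only: a per-sample difficulty score (here, batch-normalized cross-entropy loss, an assumption) selects the RandAugment magnitude applied to each sample. The helpers `sample_difficulty` and `augment_batch`, and the easy-gets-stronger mapping, are illustrative assumptions, not the paper's actual modules.

```python
# Hypothetical sketch of sample-aware augmentation (not the paper's code).
import torch
import torch.nn.functional as F
from torchvision.transforms import RandAugment

def sample_difficulty(model, images, labels):
    """Score each sample by its current loss, normalized to [0, 1] per batch.

    Assumption: the paper's heuristic score-based module is not public;
    per-sample cross-entropy loss is used here as a stand-in signal.
    """
    with torch.no_grad():
        logits = model(images.float() / 255.0)  # model expects float inputs
        losses = F.cross_entropy(logits, labels, reduction="none")
    return (losses - losses.min()) / (losses.max() - losses.min() + 1e-8)

def augment_batch(images, scores, max_magnitude=30):
    """Pick a RandAugment magnitude per sample from its difficulty score.

    Assumption: easy samples (low score) receive stronger distortion;
    the paper's sample-perception step may use a different mapping.
    """
    out = []
    for img, s in zip(images, scores):
        magnitude = int((1.0 - s.item()) * max_magnitude)
        aug = RandAugment(num_ops=2, magnitude=magnitude)
        out.append(aug(img))  # img: uint8 tensor of shape (C, H, W)
    return torch.stack(out)

# Usage: score the clean batch with the current model, then train on the
# per-sample augmented batch.
# scores = sample_difficulty(model, images, labels)
# augmented = augment_batch(images, scores)
```

Whether hard samples should receive weaker or stronger augmentation is itself a design choice; the paper's three steps (distribution exploration, sample perception, distribution refinement) presumably govern that adjustment dynamically during training.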
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
Supplementary Material: zip
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 72