Tent: Fully Test-Time Adaptation by Entropy Minimization

Sep 28, 2020 (edited Mar 18, 2021) · ICLR 2021 Spotlight · Readers: Everyone
  • Keywords: deep learning, unsupervised learning, domain adaptation, self-supervision, robustness
  • Abstract: A model must adapt itself to generalize to new and different data during testing. In this setting of fully test-time adaptation, the model has only the test data and its own parameters. We propose to adapt by test entropy minimization (tent): we optimize the model for confidence as measured by the entropy of its predictions. Our method estimates normalization statistics and optimizes channel-wise affine transformations, updating online on each batch (see the code sketch after this list). Tent reduces generalization error for image classification on corrupted ImageNet and CIFAR-10/100 and reaches a new state-of-the-art error on ImageNet-C. Tent handles source-free domain adaptation on digit recognition from SVHN to MNIST/MNIST-M/USPS, on semantic segmentation from GTA to Cityscapes, and on the VisDA-C benchmark. These results are achieved in one epoch of test-time optimization without altering training.
  • One-sentence Summary: Deep networks can generalize better during testing by adapting to feedback from their own predictions.
  • Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
  • Supplementary Material: zip
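For concreteness, here is a minimal PyTorch sketch of the procedure the abstract describes: freeze the network, re-estimate normalization statistics from each test batch, and update only the channel-wise scale and shift of the normalization layers by minimizing the entropy of the model's own predictions. This is an illustrative reading of the abstract, not the authors' released implementation; the function names, the use of `nn.BatchNorm2d`, and the Adam learning rate are assumptions.

```python
import torch
import torch.nn as nn

def softmax_entropy(logits: torch.Tensor) -> torch.Tensor:
    """Per-sample entropy of the softmax distribution over classes."""
    probs = logits.softmax(dim=1)
    return -(probs * logits.log_softmax(dim=1)).sum(dim=1)

def configure_model(model: nn.Module):
    """Freeze all parameters except BatchNorm's channel-wise affine ones,
    and make BatchNorm use test-batch statistics instead of running stats."""
    model.train()                # BatchNorm computes batch statistics in train mode
    model.requires_grad_(False)  # freeze everything by default
    params = []
    for m in model.modules():
        if isinstance(m, nn.BatchNorm2d):
            m.requires_grad_(True)           # adapt scale (gamma) and shift (beta)
            m.track_running_stats = False    # estimate stats from the current batch
            m.running_mean, m.running_var = None, None
            params += [m.weight, m.bias]
    return params

@torch.enable_grad()
def tent_step(model: nn.Module, x: torch.Tensor,
              optimizer: torch.optim.Optimizer) -> torch.Tensor:
    """One online adaptation step on a test batch: predict, then minimize
    the mean prediction entropy over the batch."""
    logits = model(x)
    loss = softmax_entropy(logits).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return logits
```

A hypothetical usage loop, matching the abstract's online, label-free setting:

```python
params = configure_model(model)
optimizer = torch.optim.Adam(params, lr=1e-3)  # learning rate is an assumption
for x, _ in test_loader:   # labels are never used
    logits = tent_step(model, x, optimizer)
```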