Tent: Fully Test-Time Adaptation by Entropy Minimization

28 Sep 2020 (modified: 14 Jan 2021) · ICLR 2021 Spotlight
  • Keywords: deep learning, unsupervised learning, domain adaptation, self-supervision, robustness
  • Abstract: A model must adapt itself to generalize to new and different data during testing. This is the setting of fully test-time adaptation given only unlabeled test data and the model parameters. We propose test-time entropy minimization (tent): we optimize for model confidence as measured by the entropy of its predictions. During testing, we adapt the model features by estimating normalization statistics and optimizing channel-wise affine transformations. Tent improves robustness to corruptions for image classification on ImageNet and CIFAR-10/100 and achieves state-of-the-art error on ImageNet-C for ResNet-50. Tent shows the feasibility of target-only domain adaptation for digit classification from SVHN to MNIST/MNIST-M/USPS and semantic segmentation from GTA to Cityscapes.
  • One-sentence Summary: Deep networks can generalize better during testing by adapting to feedback from their own predictions.
  • Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
  • Supplementary Material: zip
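The adaptation recipe in the abstract (estimate normalization statistics on the test batch, then minimize the entropy of the model's own predictions by updating channel-wise scale and shift parameters) can be sketched in plain Python. The toy two-channel model, the fixed classifier weights `W`, and the finite-difference gradient below are illustrative assumptions for this sketch, not the paper's actual implementation, which uses backpropagation through a deep network's batch-norm layers.

```python
import math

def softmax(logits):
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    s = sum(exps)
    return [e / s for e in exps]

def entropy(probs):
    # Shannon entropy of a predicted distribution: the tent objective.
    return -sum(p * math.log(p) for p in probs if p > 0.0)

def batch_norm(batch):
    # Estimate per-channel normalization statistics from the test batch
    # (tent replaces the source-domain running statistics with these).
    n, c = len(batch), len(batch[0])
    mean = [sum(x[i] for x in batch) / n for i in range(c)]
    var = [sum((x[i] - mean[i]) ** 2 for x in batch) / n for i in range(c)]
    return [[(x[i] - mean[i]) / math.sqrt(var[i] + 1e-5) for i in range(c)]
            for x in batch]

# Hypothetical fixed classifier head: 2 channels -> 2 classes.
W = [[2.0, -1.0], [-1.0, 2.0]]

def predict(x, gamma, beta):
    # Channel-wise affine transform (the only adapted parameters),
    # followed by the frozen classifier.
    h = [g * xi + b for g, xi, b in zip(gamma, x, beta)]
    logits = [sum(w * hi for w, hi in zip(row, h)) for row in W]
    return softmax(logits)

def mean_entropy(batch, gamma, beta):
    return sum(entropy(predict(x, gamma, beta)) for x in batch) / len(batch)

def tent_step(batch, gamma, beta, lr=0.05, eps=1e-4):
    # One test-time adaptation step: descend the entropy objective with
    # respect to gamma and beta only (finite differences stand in for
    # backprop in this sketch).
    base = mean_entropy(batch, gamma, beta)
    grad_g, grad_b = [], []
    for i in range(len(gamma)):
        g = gamma[:]
        g[i] += eps
        grad_g.append((mean_entropy(batch, g, beta) - base) / eps)
    for i in range(len(beta)):
        b = beta[:]
        b[i] += eps
        grad_b.append((mean_entropy(batch, gamma, b) - base) / eps)
    for i in range(len(gamma)):
        gamma[i] -= lr * grad_g[i]
        beta[i] -= lr * grad_b[i]

# Adapt on an unlabeled test batch: entropy should fall as predictions sharpen.
batch = batch_norm([[0.6, 0.1], [0.2, 0.5], [0.4, 0.3], [0.1, 0.6]])
gamma, beta = [1.0, 1.0], [0.0, 0.0]
before = mean_entropy(batch, gamma, beta)
for _ in range(25):
    tent_step(batch, gamma, beta)
after = mean_entropy(batch, gamma, beta)
```

With backpropagation in place of finite differences, and the network's batch-norm layers supplying both the statistics and the affine parameters, this is the same confidence objective tent optimizes online as test batches arrive; all other weights stay frozen.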