Shift is Good: Mismatched Data Mixing Improves Test Performance

Published: 03 Feb 2026, Last Modified: 03 Feb 2026 · AISTATS 2026 Poster · CC BY 4.0
Abstract: We consider training and testing on mixture distributions whose component proportions differ between training and test. We show that in many settings, and in some sense generically, this distribution shift can be beneficial: test performance can improve when the training proportions are mismatched to the test proportions. In a variety of scenarios, we identify the optimal training proportions and quantify the extent to which such a distribution shift can be beneficial.
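As a hedged illustration of the claim (a toy sketch under my own assumptions, not the paper's construction): suppose the test distribution is a two-component mixture with proportions (w1, w2) = (0.9, 0.1), the task is to estimate each component's mean, the second component is noisier, and the test risk is the mixture-weighted squared error. Then the training proportion that minimizes test risk generally differs from the test proportion, so a deliberate train/test mismatch helps. The component means, noise levels, and sample budget below are illustrative.

```python
# Toy simulation (illustrative assumptions, not the paper's setup): split n training
# samples between two components with proportion p to component 1, estimate each
# component's mean, and score with the test-mixture-weighted squared error.
import numpy as np

rng = np.random.default_rng(0)

w1, w2 = 0.9, 0.1          # test mixture proportions
sigma1, sigma2 = 1.0, 3.0  # component noise levels (component 2 is harder)
mu1, mu2 = 0.0, 5.0        # true component means
n = 100                    # total training budget
trials = 20000             # Monte Carlo repetitions

for p in [0.1, 0.3, 0.5, 0.7, 0.9]:
    n1 = max(1, int(round(p * n)))
    n2 = max(1, n - n1)
    # Sample means of each component across many independent training draws.
    mu1_hat = mu1 + sigma1 * rng.standard_normal((trials, n1)).mean(axis=1)
    mu2_hat = mu2 + sigma2 * rng.standard_normal((trials, n2)).mean(axis=1)
    # Test risk: mixture-weighted squared estimation error, averaged over trials.
    risk = np.mean(w1 * (mu1_hat - mu1) ** 2 + w2 * (mu2_hat - mu2) ** 2)
    print(f"train proportion p = {p:.1f}: test risk ~ {risk:.4f}")

# Closed form for this toy: risk(p) = w1*sigma1^2/(p*n) + w2*sigma2^2/((1-p)*n),
# minimized at p* = sqrt(w1)*sigma1 / (sqrt(w1)*sigma1 + sqrt(w2)*sigma2) ~ 0.5,
# which is far from the test proportion w1 = 0.9.
```

In this sketch, training with p = 0.5 gives roughly a third of the test risk of matching the test proportion p = 0.9, because the noisier component benefits more from extra samples. The specific optimum depends on the noise levels and weights; the point is only that the best training proportion need not equal the test proportion.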
Submission Number: 2105