Abstract: Aubin et al. (2021) recently proposed a set of low-dimensional linear problems to precisely evaluate different types of out-of-distribution generalization. In this paper, we show that one of these problems can already be solved by established algorithms, simply through better hyper-parameter tuning. We then propose an enhanced version of the linear unit-tests. Within the scope of our hyper-parameter search and the set of algorithms evaluated, AND-mask is the best-performing algorithm on this new suite of tests. Our findings on synthetic data are further reinforced by experiments on an image classification task into which we introduce spurious correlations.