Unsupervised Cross-Task Generalization via Retrieval Augmentation

13 Mar 2022 (modified: 05 May 2023) · LNLS
Abstract: Humans can perform unseen tasks by recalling relevant skills acquired previously and generalizing them to the target tasks, even without any supervision. In this paper, we aim to improve the cross-task generalization ability of massive multi-task language models such as T0 (Sanh et al., 2021) in an unsupervised setting. We propose a retrieval-augmentation method named ReCross that takes a few unlabeled examples as queries to retrieve a small subset of upstream data and uses them to update the multi-task model for better generalization. Our empirical results show that the proposed ReCross consistently outperforms non-retrieval baselines by a significant margin.
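
Below is a minimal, illustrative sketch of the retrieval-then-update idea described in the abstract, not the authors' implementation. The toy hashed bag-of-words `encode` function, the `retrieve` helper, the `finetune_on` stub, and the example strings are all assumptions standing in for a real dense encoder, the actual ReCross retrieval pipeline, and a gradient-update step on the retrieved upstream data.

```python
# Illustrative sketch only (assumptions throughout); ReCross's actual retriever
# and update procedure are described in the paper, not reproduced here.
import numpy as np


def encode(text: str, dim: int = 256) -> np.ndarray:
    """Toy stand-in encoder: L2-normalized hashed bag-of-words vector (assumption)."""
    vec = np.zeros(dim)
    for token in text.lower().split():
        vec[hash(token) % dim] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec


def retrieve(queries, upstream_pool, k=16):
    """Rank upstream examples by mean cosine similarity to the unlabeled queries."""
    q = np.stack([encode(x) for x in queries])            # (num_queries, dim)
    pool = np.stack([encode(x) for x in upstream_pool])   # (pool_size, dim)
    scores = (pool @ q.T).mean(axis=1)                    # avg similarity per upstream example
    top = np.argsort(-scores)[:k]
    return [upstream_pool[i] for i in top]


def finetune_on(model, examples):
    """Placeholder for a few gradient steps on the retrieved upstream data (assumption)."""
    raise NotImplementedError


# Usage: a few unlabeled target-task inputs serve as queries; the retrieved
# upstream examples are then used to update the multi-task model.
unlabeled_queries = ["Does the premise entail the hypothesis? Answer yes or no."]
upstream_pool = [
    "Answer the following trivia question.",
    "Decide whether the second sentence follows from the first.",
    "Summarize the article in one sentence.",
]
retrieved = retrieve(unlabeled_queries, upstream_pool, k=2)
# finetune_on(multitask_model, retrieved)  # multitask_model is hypothetical here
```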
Track: Non-Archival (will not appear in proceedings)