Abstract: Relatedness between tasks, which is key to transfer learning, is often characterized by measuring the influence of tasks on one another during sequential or simultaneous training, treating tasks as black boxes. In this paper, we propose MetaEval, a set of $101$ NLP tasks. We fit a single transformer to all MetaEval tasks jointly while conditioning it on low-dimensional task embeddings. The resulting task embeddings enable a novel analysis of the relatedness among tasks. We also show that task aspects can be used to predict task embeddings for new tasks without using any annotated examples. The predicted embeddings can modulate the encoder for zero-shot inference and outperform a zero-shot baseline on GLUE tasks. The provided multitask setup can function as a benchmark for future transfer learning research.
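To make the conditioning mechanism concrete, below is a minimal PyTorch sketch (not the authors' released code) of a single shared encoder modulated by learned low-dimensional task embeddings. All names and hyperparameters (`TaskConditionedModel`, `d_task`, the "prepended task token" conditioning scheme, the classifier head) are illustrative assumptions; the abstract only specifies that one transformer is fit to all tasks while conditioned on low-dimensional task embeddings.

```python
# Minimal sketch of task-embedding conditioning, assuming a prepended
# "task token" scheme. Not the authors' implementation.
import torch
import torch.nn as nn

class TaskConditionedModel(nn.Module):
    def __init__(self, vocab_size=30522, d_model=256, d_task=32,
                 n_tasks=101, n_layers=4, n_heads=4, n_classes=3):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        # One low-dimensional embedding per task; these are the vectors
        # that would later be analyzed for task relatedness.
        self.task_emb = nn.Embedding(n_tasks, d_task)
        # Project the task embedding into the model dimension so it can
        # be prepended to the input sequence as an extra token.
        self.task_proj = nn.Linear(d_task, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.classifier = nn.Linear(d_model, n_classes)

    def forward(self, input_ids, task_ids):
        x = self.tok_emb(input_ids)                  # (B, T, d_model)
        t = self.task_proj(self.task_emb(task_ids))  # (B, d_model)
        x = torch.cat([t.unsqueeze(1), x], dim=1)    # prepend task token
        h = self.encoder(x)
        return self.classifier(h[:, 0])              # read out at task slot

model = TaskConditionedModel()
input_ids = torch.randint(0, 30522, (8, 16))  # dummy token batch
task_ids = torch.randint(0, 101, (8,))        # one of the 101 tasks
logits = model(input_ids, task_ids)
print(logits.shape)  # torch.Size([8, 3])
```

For the zero-shot setting described in the abstract, the learned `task_emb` lookup for an unseen task would presumably be replaced by an embedding predicted from the task's aspects (e.g., by a regressor from aspect features to the `d_task`-dimensional space); the exact predictor is not specified here.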