BERT and PALs: Projected Attention Layers for Efficient Adaptation in Multi-Task Learning

ICML 2019
Abstract: Multi-task learning shares information between related tasks, sometimes reducing the number of parameters required. State-of-the-art results across multiple natural language understanding tasks in ...