Active Multitask Learning with Committees

Anonymous

May 16, 2019 (Blind Submission)
  • Keywords: active learning, multitask learning, online learning, learning task similarities, knowledge transfer
  • TL;DR: We propose an active multitask learning algorithm that achieves knowledge transfer between tasks.
  • Abstract: The cost of annotating training data has traditionally been a bottleneck for supervised learning approaches. The problem is further exacerbated when supervised learning is applied to a number of correlated tasks simultaneously, since the number of labels required scales with the number of tasks. To mitigate this concern, we propose an active multitask learning algorithm that achieves knowledge transfer between tasks. The approach forms a so-called committee for each task that jointly makes decisions and directly shares data across similar tasks. Our approach reduces the number of queries needed during training while maintaining high accuracy on test data. Empirical results on benchmark datasets show significant improvements in both accuracy and the number of queries required.
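The query-by-committee idea underlying the abstract can be illustrated with a minimal sketch. This is an illustrative toy, not the authors' algorithm: the single-task perceptron committee, the disagreement-based query rule, and the synthetic data stream are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stream of 2-D points; the true label is the sign of the first
# feature, with a margin enforced so the stream is cleanly separable.
X = rng.normal(size=(400, 2))
X = X[np.abs(X[:, 0]) > 0.5][:200]
y = np.where(X[:, 0] > 0, 1, -1)

# A committee of three perceptrons with small random initial weights.
committee = [rng.normal(scale=0.1, size=2) for _ in range(3)]

queries = 0
for x, label in zip(X, y):
    votes = [1 if w @ x > 0 else -1 for w in committee]
    if len(set(votes)) > 1:        # committee disagrees -> query the oracle
        queries += 1
        for w in committee:
            if (1 if w @ x > 0 else -1) != label:
                w += label * x     # standard perceptron update
```

When the committee members agree, the point is handled without a label request; only disagreements trigger a query, which is why the total number of queries stays well below the length of the stream.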