Deciphering Multi-task Learning: Comparative Insights for Similar and Dissimilar Tasks

ACL ARR 2024 June Submission 5804 Authors

16 Jun 2024 (modified: 31 Jul 2024) · ACL ARR 2024 June Submission · CC BY 4.0
Abstract: Multi-task learning (MTL), a powerful paradigm in machine learning, trains a single shared model to handle multiple tasks simultaneously. The advantages of this approach motivate us to investigate tasks of similar genres (identification of Sentiment, Emotion, Sarcasm, Irony, Hate, and Offensive content) and dissimilar genres (identification of Sentiment, Claim, and Language), and to analyze how their performance changes under long- and short-head approaches. We describe the methods employed and report critical observations, with the aim of promoting a more efficient learning paradigm across similar and dissimilar tasks.
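The shared-model setup the abstract describes is commonly realized as hard parameter sharing: one encoder produces a representation that several task-specific heads consume. The sketch below is a minimal, hypothetical illustration of that pattern in NumPy; the paper's actual architecture, dimensions, and task heads are assumptions, not details taken from the submission.

```python
import numpy as np

# Illustrative sketch of hard parameter sharing in multi-task learning.
# One shared encoder feeds several task-specific heads, e.g. one head for
# sentiment and one for sarcasm identification. All sizes are made up.

rng = np.random.default_rng(0)

class SharedEncoder:
    """A single hidden layer shared by every task."""
    def __init__(self, in_dim, hidden_dim):
        self.W = rng.normal(scale=0.1, size=(in_dim, hidden_dim))

    def __call__(self, x):
        return np.maximum(x @ self.W, 0.0)  # ReLU activation

class TaskHead:
    """A lightweight per-task classifier on top of the shared encoder."""
    def __init__(self, hidden_dim, n_classes):
        self.W = rng.normal(scale=0.1, size=(hidden_dim, n_classes))

    def __call__(self, h):
        z = h @ self.W
        e = np.exp(z - z.max(axis=-1, keepdims=True))
        return e / e.sum(axis=-1, keepdims=True)  # softmax probabilities

encoder = SharedEncoder(in_dim=16, hidden_dim=8)
heads = {
    "sentiment": TaskHead(8, 3),  # e.g. negative / neutral / positive
    "sarcasm": TaskHead(8, 2),    # e.g. sarcastic / not sarcastic
}

x = rng.normal(size=(4, 16))      # a batch of 4 feature vectors
h = encoder(x)                    # shared representation, computed once
outputs = {task: head(h) for task, head in heads.items()}
```

In this scheme the encoder's parameters receive gradients from every task's loss, which is what couples the tasks; the "long-head" versus "short-head" distinction the abstract mentions would correspond to varying the depth of each `TaskHead`.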
Paper Type: Short
Research Area: Machine Learning for NLP
Research Area Keywords: multi-task learning
Contribution Types: NLP engineering experiment
Languages Studied: English
Submission Number: 5804