Deciphering Multi-task Learning: Comparative Insights for Similar and Dissimilar Tasks

Anonymous

16 Feb 2024 · ACL ARR 2024 February Blind Submission · Readers: Everyone
Abstract: Multi-Task Learning (MTL), which has emerged as a powerful paradigm in machine learning, employs a single shared model trained to handle multiple tasks at the same time. The numerous advantages of this approach motivate us to investigate how tasks from similar genres (identification of sentiment, emotion, sarcasm, irony, hate, and offensive language) and dissimilar genres (identification of sentiment, claim, and language) behave under MTL, and to analyze how their performance changes under long- and short-head approaches. We shed light on the methods employed and report critical observations to promote more efficient learning paradigms across similar and dissimilar tasks.
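The shared-model setup the abstract describes is commonly realized as hard parameter sharing: one encoder feeds several task-specific heads. The following is a minimal NumPy sketch of that idea, not the paper's actual architecture; the layer sizes, task names, and class counts are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Simple nonlinearity for the shared encoder.
    return np.maximum(0.0, x)

# Shared encoder weights: in a real MTL model these receive
# gradients from every task's loss. Sizes are arbitrary.
W_shared = rng.standard_normal((16, 8))

# One lightweight classification head per task (hypothetical
# tasks/class counts, loosely mirroring the similar-genre group).
heads = {
    "sentiment": rng.standard_normal((8, 3)),  # e.g. neg / neu / pos
    "emotion":   rng.standard_normal((8, 6)),  # e.g. six emotion classes
    "sarcasm":   rng.standard_normal((8, 2)),  # sarcastic / not
}

def forward(x):
    """Encode the input once, then score every task from the
    same shared features (hard parameter sharing)."""
    h = relu(x @ W_shared)
    return {task: h @ W_head for task, W_head in heads.items()}

batch = rng.standard_normal((4, 16))  # 4 toy examples
outputs = forward(batch)
for task, logits in outputs.items():
    print(task, logits.shape)
```

Each task here shares the cost of the encoder while keeping only a small per-task head, which is the main parameter-efficiency argument for MTL over training separate models per task.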
Paper Type: short
Research Area: Machine Learning for NLP
Contribution Types: NLP engineering experiment
Languages Studied: English