MemeSem: A Multimodal Framework for Sentiment Analysis of Memes via Transfer Learning

12 Jun 2020 (modified: 05 May 2023) · LifelongML@ICML2020
Student First Author: Yes
Keywords: Multimodal, Social Computing, Sentiment Analysis, Transfer Learning, Natural Language Processing, Deep Learning
Abstract: Internet memes have become one of the most popular forms of online content, yet despite this growth, sentiment analysis of memes has received relatively little attention. In this paper, we present MemeSem, a multimodal deep neural network framework for sentiment analysis of memes via transfer learning. Our proposed model uses VGG19 pre-trained on the ImageNet dataset and the BERT language model to learn the visual and textual features of a meme, and combines them to make predictions. We perform a comparative analysis of MemeSem against various baseline models. For our experiments, we prepared a dataset of 10,115 internet memes labelled with three sentiment classes (Positive, Negative, and Neutral). Our proposed model outperforms both the multimodal baselines and the unimodal models based on images or text alone. On average, MemeSem outperforms the unimodal and multimodal baselines by 10.69\% and 3.41\%, respectively.
TL;DR: A transfer-learning-based multimodal framework that predicts the sentiment of internet memes through combined analysis of their visual and textual attributes.
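The abstract does not spell out how the VGG19 and BERT branches are fused, so the snippet below is only a minimal sketch of the general idea: extract a 4096-d visual feature from a truncated VGG19 head, a 768-d pooled BERT representation, concatenate them, and classify into three sentiment classes. PyTorch, torchvision, Hugging Face `bert-base-uncased`, the hidden size, and the concatenation-based fusion head are all assumptions, not details taken from the paper.

```python
# Minimal sketch of a VGG19 + BERT late-fusion classifier for 3-class meme sentiment.
# Assumptions (not from the paper): PyTorch, torchvision, Hugging Face transformers,
# fusion by simple feature concatenation, and an arbitrary hidden size of 512.
import torch
import torch.nn as nn
from torchvision import models
from transformers import BertModel, BertTokenizer


class MemeSentimentNet(nn.Module):
    def __init__(self, num_classes: int = 3, hidden_dim: int = 512):
        super().__init__()
        # Visual branch: VGG19 pre-trained on ImageNet, FC head truncated to 4096-d features.
        vgg = models.vgg19(weights="IMAGENET1K_V1")
        vgg.classifier = nn.Sequential(*list(vgg.classifier.children())[:-1])
        self.visual = vgg  # outputs (batch, 4096)

        # Textual branch: pre-trained BERT; pooled [CLS] representation is 768-d.
        self.text = BertModel.from_pretrained("bert-base-uncased")

        # Fusion head: concatenate visual and textual features, then classify.
        self.classifier = nn.Sequential(
            nn.Linear(4096 + 768, hidden_dim),
            nn.ReLU(),
            nn.Dropout(0.3),
            nn.Linear(hidden_dim, num_classes),  # Positive / Negative / Neutral
        )

    def forward(self, images, input_ids, attention_mask):
        img_feat = self.visual(images)                    # (batch, 4096)
        txt_out = self.text(input_ids=input_ids,
                            attention_mask=attention_mask)
        txt_feat = txt_out.pooler_output                  # (batch, 768)
        fused = torch.cat([img_feat, txt_feat], dim=1)    # (batch, 4864)
        return self.classifier(fused)                     # (batch, num_classes)


if __name__ == "__main__":
    model = MemeSentimentNet()
    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    enc = tokenizer(["when the code finally compiles"], return_tensors="pt",
                    padding=True, truncation=True, max_length=64)
    dummy_images = torch.randn(1, 3, 224, 224)            # ImageNet-sized input
    logits = model(dummy_images, enc["input_ids"], enc["attention_mask"])
    print(logits.shape)                                    # torch.Size([1, 3])
```

A concatenation head like this is the simplest way to combine the two pre-trained encoders; the reported gains over unimodal baselines would come from training such a joint head (and optionally fine-tuning the encoders) on the labelled meme dataset.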