Bert-Pair-Networks for Sentiment Classification

Published: 01 Jan 2020, Last Modified: 20 May 2025. ICMLC 2020. License: CC BY-SA 4.0
Abstract: BERT has demonstrated excellent performance in natural language processing thanks to unsupervised pre-training on large text corpora. However, because the model is pre-trained to predict the next sentence, it handles sentence-pair tasks well but may not be sufficiently good for other tasks. In this paper, we introduce a novel representation framework, BERT-pair-Networks (p-BERTs), for sentiment classification: p-BERTs adopt BERT to encode sentences for sentiment classification, a classic single-sentence classification task, by pairing each input sentence with an auxiliary sentence, and add a feature extraction layer on top. Results on three datasets show that our method achieves considerably improved performance.
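To make the pairing idea concrete, the sketch below shows one way to cast single-sentence sentiment classification as a sentence-pair task, assuming a HuggingFace-style BertModel and BertTokenizer. The auxiliary-sentence template and the single linear head used as the feature extraction layer are illustrative assumptions, not the authors' exact design.

```python
# Minimal sketch of the BERT-pair idea, assuming HuggingFace transformers.
# The auxiliary sentence and the linear head are illustrative assumptions.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer


class BertPairClassifier(nn.Module):
    def __init__(self, num_labels: int = 2, model_name: str = "bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        # Feature extraction layer on top of the pooled [CLS] representation
        # (a single linear head here; the paper's layer may differ).
        self.head = nn.Linear(self.bert.config.hidden_size, num_labels)

    def forward(self, input_ids, attention_mask, token_type_ids):
        outputs = self.bert(
            input_ids=input_ids,
            attention_mask=attention_mask,
            token_type_ids=token_type_ids,
        )
        return self.head(outputs.pooler_output)


tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertPairClassifier(num_labels=2)

sentence = "The battery life of this phone is amazing."
# Hypothetical auxiliary sentence that turns the single-sentence task
# into a sentence-pair input for BERT.
auxiliary = "What is the sentiment of this sentence?"

encoded = tokenizer(
    sentence,
    text_pair=auxiliary,
    return_tensors="pt",
    padding=True,
    truncation=True,
)

with torch.no_grad():
    logits = model(**encoded)
print(logits.shape)  # (1, num_labels)
```

The tokenizer's `text_pair` argument builds the `[CLS] sentence [SEP] auxiliary [SEP]` input with segment ids, so the same sentence-pair machinery BERT was pre-trained on is reused for what is otherwise a single-sentence classification task.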