Abstract: Natural language inference aims to classify the relationship between a pair of opinionated sentences as contradiction, entailment, or neutral. To accomplish this task, classifiers transform textual data into numerical representations called vectors or embeddings. In this study, both static (GloVe, OntoNotes5) and contextual (BERT) word embedding methods are used. Classifying the logical relationships between opinionated sentences is difficult: their complex grammatical structures resist conversion into logical representations, and traditional natural language processing solutions are insufficient for the task. This study uses the Decomposable Attention and Enhanced LSTM for Natural Language Inference (ESIM) deep learning methods to perform the classification. The best accuracy score, 88%, is achieved with ESIM using BERT embeddings on the SNLI corpus.
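The static-versus-contextual distinction the abstract relies on can be illustrated with a minimal toy sketch (not the paper's implementation): a GloVe-style static table assigns one fixed vector per word type, whereas a contextual encoder such as BERT produces a different vector for the same word depending on its sentence. The tiny lookup table and the blending "encoder" below are invented for illustration only.

```python
# Hypothetical static lookup table (GloVe-style): one fixed vector per word.
STATIC = {
    "bank":  (0.2, 0.7),
    "river": (0.9, 0.1),
    "money": (0.1, 0.8),
}

def static_embed(tokens):
    # A static embedding returns the same vector for a word
    # regardless of the sentence it appears in.
    return [STATIC[t] for t in tokens]

def contextual_embed(tokens, mix=0.5):
    # Toy "contextual" encoder: blend each word vector with the
    # sentence mean, so identical words get different vectors in
    # different sentences (BERT-like behaviour, vastly simplified).
    vecs = [STATIC[t] for t in tokens]
    ctx = tuple(sum(dim) / len(vecs) for dim in zip(*vecs))
    return [tuple((1 - mix) * a + mix * c for a, c in zip(v, ctx))
            for v in vecs]

# "bank" in two different contexts:
s1, s2 = ["river", "bank"], ["money", "bank"]
assert static_embed(s1)[1] == static_embed(s2)[1]          # static: identical
assert contextual_embed(s1)[1] != contextual_embed(s2)[1]  # contextual: differ
```

In the study itself, the static vectors come from pre-trained GloVe/OntoNotes5 resources and the contextual vectors from BERT's hidden states before being fed to the Decomposable Attention or ESIM classifier.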