Knowledge Enhanced Embedding: Improve Model Generalization Through Knowledge Graphs

Anonymous

16 Nov 2021 (modified: 05 May 2023) · ACL ARR 2021 November Blind Submission · Readers: Everyone
Abstract: Pre-trained language models have achieved excellent results in NLP and NLI, and since the introduction of BERT, many BERT variants have emerged. They capture rich linguistic representations from large-scale corpora in different ways, but when reading text they struggle to combine external knowledge and infer additional meanings the text may carry, as people do. To this end, we propose a language model (K2E-BERT) that simply incorporates external knowledge by fusing knowledge graph information (triples) with the entity information in the original text. To integrate external knowledge into the text without drifting from the original meaning of the sentence, we propose a method called EaKA (Entity and Knowledge Align), which aligns and combines entities with knowledge so that the model can accept new external knowledge without losing the sentence's original meaning. In addition, K2E-BERT surpasses BERT without changing BERT's internal structure, which shows that our approach is feasible. Experiments on several selected NLP tasks show good results, indicating that K2E-BERT clearly surpasses BERT in generalization ability and demonstrating its effectiveness.
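The abstract does not spell out how K2E-BERT or EaKA are implemented, so the snippet below is only a minimal, hypothetical sketch of the general idea of input-level knowledge fusion: matching knowledge-graph triples are attached to entity mentions in the text before it is fed to an unmodified BERT encoder. All names here (knowledge_graph, inject_triples) are illustrative assumptions, not the paper's actual method.

```python
# Hypothetical sketch of input-level knowledge injection: entity mentions in a
# sentence are paired with matching knowledge-graph triples before the text is
# handed to an unmodified BERT encoder. Names are illustrative, not from the paper.

from typing import Dict, List, Tuple

# Toy knowledge graph: entity -> list of (relation, object) pairs.
knowledge_graph: Dict[str, List[Tuple[str, str]]] = {
    "Paris": [("capital_of", "France")],
    "BERT": [("is_a", "language model")],
}

def inject_triples(sentence: str, kg: Dict[str, List[Tuple[str, str]]]) -> str:
    """Append matching triples in brackets right after each entity mention,
    keeping the injected knowledge close to the original sentence."""
    augmented: List[str] = []
    for token in sentence.split():
        augmented.append(token)
        entity = token.strip(".,!?")
        for relation, obj in kg.get(entity, []):
            # Knowledge stays adjacent to its entity so a standard BERT input
            # (no architecture change) can still attend to it.
            augmented.append(f"[{entity} {relation} {obj}]")
    return " ".join(augmented)

if __name__ == "__main__":
    text = "Paris hosted the conference."
    print(inject_triples(text, knowledge_graph))
    # -> "Paris [Paris capital_of France] hosted the conference."
```

In a full system, an alignment step such as the paper's EaKA would presumably control how strongly the injected triples interact with the original tokens; this sketch only shows the text-level fusion that precedes the encoder.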