Learning New Facts From Knowledge Bases With Neural Tensor Networks and Semantic Word Vectors

Danqi Chen, Richard Socher, Christopher Manning, Andrew Y. Ng

Jan 17, 2013 (modified: Jan 17, 2013) · ICLR 2013 conference submission
  • Decision: conferencePoster-iclr2013-workshop
  • Abstract: Knowledge bases give applications easily accessible, systematic relational knowledge, but in practice they often suffer from incompleteness and from lacking knowledge of new entities and relations. Much work has focused on building or extending them by finding patterns in large unannotated text corpora. In contrast, here we aim mainly to complete a knowledge base by predicting additional true relationships between entities, based on generalizations that can be discerned in the given knowledge base. We introduce a neural tensor network (NTN) model which predicts new relationship entries that can be added to the database. This model can be improved by initializing entity representations with word vectors learned in an unsupervised fashion from text; when doing so, existing relations can even be queried for entities that were not present in the database. Our model generalizes and outperforms existing models for this problem, and can classify unseen relationships in WordNet with an accuracy of 82.8%.
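To make the abstract's "neural tensor network" concrete, here is a minimal NumPy sketch of an NTN-style scoring function for a candidate triple (e1, R, e2). The exact parameterization (slice count k, the extra linear layer and bias, the tanh nonlinearity, and all variable names) is an illustrative assumption drawn from the general NTN literature, not a verbatim reproduction of this paper's model:

```python
import numpy as np

def ntn_score(e1, e2, W, V, b, u):
    """Score a candidate relation triple (e1, R, e2).

    Assumed (hypothetical) per-relation parameters:
      W: (k, d, d) bilinear tensor with k slices
      V: (k, 2d)   standard linear layer over [e1; e2]
      b: (k,)      bias
      u: (k,)      output weights mapping hidden units to a scalar
    A higher score suggests the triple is more plausible.
    """
    bilinear = np.einsum('i,kij,j->k', e1, W, e2)  # e1^T W^[1:k] e2, one value per slice
    linear = V @ np.concatenate([e1, e2])          # standard feed-forward term
    hidden = np.tanh(bilinear + linear + b)        # elementwise nonlinearity
    return float(u @ hidden)                       # scalar plausibility score

# Toy usage with random parameters; d = entity vector size, k = tensor slices.
d, k = 4, 3
rng = np.random.default_rng(0)
e1, e2 = rng.standard_normal(d), rng.standard_normal(d)
W = rng.standard_normal((k, d, d))
V = rng.standard_normal((k, 2 * d))
b, u = rng.standard_normal(k), rng.standard_normal(k)
score = ntn_score(e1, e2, W, V, b, u)
```

In a completed system, e1 and e2 would be entity vectors (possibly initialized from unsupervised word vectors, as the abstract describes), and a triple would be accepted when its score clears a learned threshold.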
