Training deep nets on probabilistic knowledge bases by exploiting ontological constraints

Anonymous

Nov 17, 2018, AKBC 2019 Conference Blind Submission
  • Keywords: knowledge bases, ontological constraints, neural networks
  • TL;DR: A neural network that combines ontological constraints with triples extracted by weakly supervised systems, for refining knowledge bases automatically extracted from text.
  • Abstract: We propose a deep learning architecture that encodes a probabilistic knowledge base, enriched with relevant logical constraints over the observed assertions, in order to learn embeddings suitable for different downstream tasks such as automatic refinement and link prediction. Our experiments show that leveraging logical constraints on top of probabilistic assertions in an end-to-end differentiable fashion is effective: it improves performance over similar algorithms without requiring additional supervision, even on large, constraint-rich knowledge bases.
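To make the abstract's idea concrete, here is a minimal hypothetical sketch (not the authors' actual architecture) of combining a loss on a weakly supervised triple with a soft, differentiable penalty for an ontological implication constraint. The DistMult-style scoring function, the example entities/relations, and the extraction confidence value are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8
# Entity and relation embeddings (shapes and names assumed for illustration).
e = {name: rng.normal(size=dim) for name in ["paris", "france"]}
r = {name: rng.normal(size=dim) for name in ["capital_of", "located_in"]}

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def score(h, rel, t):
    """Probability that triple (h, rel, t) holds (DistMult score + sigmoid)."""
    return sigmoid(np.sum(e[h] * r[rel] * e[t]))

# Soft relaxation of the ontological constraint
#   capital_of(x, y) -> located_in(x, y):
# penalise the amount by which the body's probability exceeds the head's.
p_body = score("paris", "capital_of", "france")
p_head = score("paris", "located_in", "france")
constraint_penalty = max(0.0, p_body - p_head)

# Cross-entropy loss against the extractor's confidence (0.9, assumed) for
# an observed, weakly supervised triple; the total loss adds the soft
# constraint term, so both are differentiable end to end w.r.t. embeddings.
target = 0.9
data_loss = -(target * np.log(p_body) + (1 - target) * np.log(1 - p_body))
total_loss = data_loss + constraint_penalty
print(round(total_loss, 4))
```

In an actual training loop these quantities would be computed in an autodiff framework so gradients of `total_loss` update the embeddings, which is what "end-to-end differentiable" refers to here.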
  • Archival status: Non-Archival
  • Subject areas: Machine Learning, Semantic Web, Natural Language Processing, Information Integration