Keywords: Knowledge Graphs, Knowledge Graph Embeddings, Negatives Generation
TL;DR: We revamp a strategy for generating negatives during the training of Knowledge Graph Embedding models that respects the domain and range of relations, and we show it brings substantial improvements on standard benchmarks and an ontology-backed dataset.
Abstract: Knowledge Graph Embedding models, which represent entities and edges in a low-dimensional space, have been extremely successful at tasks related to completing and exploring Knowledge Graphs (KGs). One of the key aspects of training is teaching these models to discriminate between positive and negative facts. Most KGs, however, do not come with ground-truth negatives, which makes their synthetic generation a necessity. Different generation strategies can heavily affect the quality of the embeddings, making the choice of strategy a primary aspect to consider. We revamp a strategy that generates corruptions respecting the domain and range of relations, we extend its capabilities, and we show our method brings substantial improvements (+10\% MRR) on standard benchmark datasets and over +150\% MRR on a larger ontology-backed dataset.
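To make the core idea concrete, below is a minimal sketch of domain/range-constrained corruption generation. It is not the authors' implementation nor AmpliGraph's API; the function name, the `domain_of`/`range_of` mappings, and the `class_members` index are illustrative assumptions about how an ontology's type constraints could be used to restrict the pool of corruption candidates.

```python
import random

def domain_range_negatives(triple, domain_of, range_of, class_members, n=5):
    """Illustrative sketch (not the paper's code): corrupt a
    (subject, relation, object) triple using only entities that belong to
    the relation's domain class (for subject corruptions) or range class
    (for object corruptions), so negatives remain type-consistent."""
    s, r, o = triple
    # Candidate subjects: entities of the relation's domain class, excluding the true subject.
    subj_pool = [e for e in class_members.get(domain_of[r], []) if e != s]
    # Candidate objects: entities of the relation's range class, excluding the true object.
    obj_pool = [e for e in class_members.get(range_of[r], []) if e != o]

    negatives = []
    for _ in range(n):
        # Corrupt either the subject or the object with ~equal probability.
        if subj_pool and random.random() < 0.5:
            negatives.append((random.choice(subj_pool), r, o))
        elif obj_pool:
            negatives.append((s, r, random.choice(obj_pool)))
    return negatives

# Toy usage with hypothetical ontology metadata:
domain_of = {"worksFor": "Person"}
range_of = {"worksFor": "Company"}
class_members = {"Person": ["alice", "bob", "carol"],
                 "Company": ["acme", "globex"]}
print(domain_range_negatives(("alice", "worksFor", "acme"),
                             domain_of, range_of, class_members))
```

The key design point the sketch illustrates is that corruptions are sampled from type-compatible entity pools rather than from the full entity set, which is what "respecting the domain and range of relations" means in the abstract above.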
Submission Type: Extended abstract (max 4 main pages).
Software: https://github.com/Accenture/AmpliGraph/tree/paper/LoG-24-OntologyNegatives
Poster: png
Poster Preview: png
Submission Number: 172