RETRA: Recurrent Transformers for Learning Temporally Contextualized Knowledge Graph Embeddings

Published: 23 Feb 2021, Last Modified: 05 May 2023
ESWC 2021 Research
Keywords: Knowledge Graph Embedding, Contextualized Embeddings, Modeling Temporal Context
Abstract: Knowledge graph embeddings (KGE) are vector representations that capture the global distributional semantics of each entity instance and relation type in a static Knowledge Graph (KG). While KGEs can embed information related to an entity into a single representation, they cannot be customized to a specific context. This is fundamentally limiting for many applications, since the latent state of an entity can change depending on the current situation and the entity's history of related observations. Such context-specific roles an entity might play cannot be captured by global KGEs, since doing so requires generating an embedding unique to each situation. This paper proposes a KG modeling template for temporally contextualized observations and introduces the Recurrent Transformer (RETRA), a neural encoder stack with a feedback loop and constrained multi-headed self-attention layers. RETRA transforms global KGEs into custom embeddings, given the situation-specific factors of the relation and the subjective history of the entity. In this way, entity embeddings can be contextualized for downstream Knowledge Graph Tasks (KGT), such as link prediction for location recommendation, event prediction, or driving-scene classification. Our experimental results demonstrate the benefits standard KGEs gain when they are customized to the situational context.
Subtrack: Machine Learning
First Author Is Student: Yes
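
The abstract only outlines the architecture at a high level. As a rough, hypothetical illustration (not the authors' implementation), the PyTorch sketch below contextualizes a global entity embedding by encoding it together with embeddings of situation-specific relation factors and a recurrent state that summarizes the entity's history of observations. The class name, the layout of the context factors, and the GRU-based feedback update are all assumptions made for this sketch.

```python
import torch
import torch.nn as nn


class RETRASketch(nn.Module):
    """Hypothetical sketch of a recurrent transformer encoder that turns a
    global entity embedding into a situation-specific one, carrying a
    recurrent state across the entity's observed history (feedback loop)."""

    def __init__(self, dim: int, n_heads: int = 4, n_layers: int = 2):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        # Feedback loop: fold the contextualized output back into the state.
        self.state_update = nn.GRUCell(dim, dim)

    def step(self, entity_emb, context_feats, state):
        # Input sequence: [recurrent state, global entity, situational factors].
        # context_feats: (batch, n_factors, dim)
        tokens = [state, entity_emb] + list(context_feats.unbind(1))
        seq = torch.stack(tokens, dim=1)              # (batch, seq_len, dim)
        encoded = self.encoder(seq)
        contextual_emb = encoded[:, 1]                # token at the entity position
        new_state = self.state_update(contextual_emb, state)
        return contextual_emb, new_state

    def forward(self, entity_emb, context_seq):
        # context_seq: (batch, time, n_factors, dim), one set of situational
        # factor embeddings per observed time step.
        batch, steps, _, dim = context_seq.shape
        state = torch.zeros(batch, dim, device=context_seq.device)
        outputs = []
        for t in range(steps):
            contextual_emb, state = self.step(entity_emb, context_seq[:, t], state)
            outputs.append(contextual_emb)
        return torch.stack(outputs, dim=1)            # (batch, time, dim)


# Example usage with made-up shapes: 8 entities, 5 time steps, 3 factors each.
model = RETRASketch(dim=64)
entity = torch.randn(8, 64)
context = torch.randn(8, 5, 3, 64)
contextualized = model(entity, context)               # (8, 5, 64)
```

The contextualized embeddings produced per time step could then be scored by any standard KGE link-prediction decoder in place of the static entity embedding; the actual RETRA constraints on the self-attention layers are described in the paper, not in this sketch.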