Abstract: Hyper-relational knowledge graphs (HKGs) significantly enhance traditional triple-based knowledge graphs (KGs) by introducing role-value pairs. Recently, Transformer-based hyper-relational knowledge graph completion (HKGC) methods have gained widespread adoption and achieved substantial advancements. However, these methods focus exclusively on entity prediction during training, leading to suboptimal performance on evaluation metrics such as Hit@k. In HKGs, entities and relations are interdependent, and this exclusive emphasis on entity prediction limits the self-attention mechanism's ability to adequately capture information about relational properties. This limitation in turn hinders the model's ability to learn the entity prediction task efficiently. Therefore, in this paper, we propose MTL-HKGC, a multi-task learning framework that augments the model's capacity to capture relational information by incorporating a relation prediction task, which in turn improves entity prediction. Additionally, we introduce an effective dynamic loss balancing method that adjusts loss weights according to changes in task difficulty during training. This approach enables the model to prioritize the entity prediction task after mastering the simpler relation prediction task, thus enhancing HKGC performance. Experimental results on two prominent HKGC datasets validate the effectiveness of the proposed MTL-HKGC.
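The dynamic loss balancing idea described above, where the task whose loss is descending more slowly receives a larger weight, can be sketched as follows. This is a minimal illustration under assumed conventions, not the paper's exact formulation: the function name, the use of loss ratios as a difficulty signal, and the softmax-style normalization are all assumptions for exposition.

```python
import math

def dynamic_task_weights(curr_losses, prev_losses, temperature=1.0):
    """Weight each task's loss by its relative rate of descent.

    A task whose loss is barely improving (ratio near 1) is treated as
    harder and receives a larger weight; a nearly-mastered task (ratio
    near 0) is down-weighted. Hypothetical sketch; the paper's actual
    weighting scheme may differ.
    """
    # ratio close to 1 => loss not improving => harder task right now
    ratios = [c / max(p, 1e-12) for c, p in zip(curr_losses, prev_losses)]
    exps = [math.exp(r / temperature) for r in ratios]
    z = sum(exps)
    n = len(ratios)
    # scale so weights sum to the number of tasks (uniform weight == 1.0)
    return [n * e / z for e in exps]

# Example: entity prediction stalled, relation prediction improving fast
w_entity, w_relation = dynamic_task_weights([0.9, 0.2], [1.0, 0.8])
# the stalled entity task keeps the larger weight
```

With these numbers the entity loss ratio is 0.9 and the relation loss ratio is 0.25, so the entity task ends up with the larger weight, matching the intuition that training should shift toward entity prediction once relation prediction is largely solved.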