Abstract: Studies of ensemble learning in knowledge graph embedding (KGE) show that combining multiple individual KGE models can improve performance on knowledge graph completion. However, existing KGE ensemble methods neglect the creation of model diversity because they train the individual models independently, without any interaction during training. To create rich model diversity, we propose a novel training method, the ensemble of bilinear models (EBM), for knowledge graph completion. EBM uses a weighted loss that allows the individual KGE models to interact during training, so that each relation in the knowledge graph is automatically modeled by the most suitable individual model. Experiments on knowledge graph completion show that EBM yields richer diversity and outperforms both single bilinear models and ensemble methods without training interaction.
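To make the weighted-loss idea concrete, the following is a minimal sketch (not the paper's actual implementation) of how several bilinear scorers could be coupled through relation-specific loss weights during training. The class name `WeightedBilinearEnsemble`, the DistMult-style scoring function, and the softmax-normalised per-relation mixing weights are all illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class WeightedBilinearEnsemble(nn.Module):
    """Illustrative sketch: bilinear KGE scorers trained jointly under a
    weighted loss, with per-relation weights selecting the best scorer."""

    def __init__(self, num_entities, num_relations, dim, num_models=2):
        super().__init__()
        # Separate entity/relation embeddings for each ensemble member
        self.ent = nn.ModuleList(nn.Embedding(num_entities, dim) for _ in range(num_models))
        self.rel = nn.ModuleList(nn.Embedding(num_relations, dim) for _ in range(num_models))
        # Learnable per-relation mixing logits (softmax-normalised in loss)
        self.mix = nn.Parameter(torch.zeros(num_relations, num_models))

    def scores(self, h, r, t):
        # DistMult-style bilinear score <e_h, w_r, e_t> for each member
        return torch.stack(
            [(self.ent[k](h) * self.rel[k](r) * self.ent[k](t)).sum(-1)
             for k in range(len(self.ent))],
            dim=-1)                                   # shape: (batch, num_models)

    def loss(self, h, r, t, labels):
        per_model = self.scores(h, r, t)
        weights = F.softmax(self.mix[r], dim=-1)      # relation-specific weights
        # The shared weighted loss couples the members, so each relation is
        # gradually assigned to the scorer that models it best.
        per_model_loss = F.binary_cross_entropy_with_logits(
            per_model,
            labels.unsqueeze(-1).expand_as(per_model),
            reduction="none")
        return (weights * per_model_loss).sum(-1).mean()
```

Under this sketch, the mixing weights and the member embeddings receive gradients from the same objective, which is one way the training interaction described in the abstract could be realised.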