Abstract: Knowledge Graph Embedding (KGE) maps entities and relations into continuous vector spaces to facilitate link prediction. Because knowledge graphs cannot directly supply high-quality negative samples at multiple difficulty levels, existing methods typically rely on post-sampling assessment strategies, which cannot controllably generate difficulty-calibrated negatives tailored to diverse KGE training requirements. To address these challenges, we propose ConDNS, a novel conditional diffusion-based negative sampling method for knowledge graph embedding. By adjusting the diffusion timestep, our model dynamically modulates the difficulty of synthetic negatives while exploiting global entity-relation information. This enables the generation of semantically valid samples that integrate synergistically with conventionally sampled negatives, thereby overcoming single-strategy sampling bottlenecks and establishing a multiscale difficulty configuration. Experiments demonstrate that ConDNS achieves state-of-the-art performance across multiple benchmarks with minimal synthetic samples while functioning as a plug-and-play module compatible with mainstream KGE architectures. Source code is available at: https://github.com/zrj-wang/ConDNS.
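To make the timestep-as-difficulty-knob idea concrete, the sketch below shows one plausible way a conditional diffusion model could generate a synthetic negative tail embedding: the true tail is noised up to a chosen timestep and then denoised back under (head, relation) conditioning, so the starting timestep controls how far the negative departs from the positive. This is a minimal illustration, not the authors' implementation; the names (`CondDenoiser`, `generate_negative`), the embedding dimension, the noise schedule, and the direction of the difficulty mapping are all assumptions.

```python
# Minimal sketch (not ConDNS's actual code) of timestep-controlled
# negative generation with a conditional diffusion model.
import torch
import torch.nn as nn

T = 100                                   # total diffusion steps (assumed)
betas = torch.linspace(1e-4, 0.02, T)     # standard linear noise schedule
alphas = 1.0 - betas
alpha_bars = torch.cumprod(alphas, dim=0)

class CondDenoiser(nn.Module):
    """Predicts the noise in a corrupted tail embedding, conditioned on the
    (head, relation) context -- a hypothetical stand-in for conditioning on
    global entity-relation information."""
    def __init__(self, dim=200):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim * 3 + 1, 512), nn.SiLU(),  # inputs: x_t, head, rel, t
            nn.Linear(512, dim),
        )

    def forward(self, x_t, head, rel, t):
        t_feat = t.float().unsqueeze(-1) / T          # normalized timestep feature
        return self.net(torch.cat([x_t, head, rel, t_feat], dim=-1))

@torch.no_grad()
def generate_negative(model, tail, head, rel, t_start):
    """Noise the true tail embedding up to t_start, then denoise back to 0.
    t_start acts as the difficulty knob: it sets how strongly the synthetic
    negative is perturbed away from the positive before conditional denoising."""
    b = tail.size(0)
    ab = alpha_bars[t_start - 1]
    # Forward diffusion in closed form: x_t = sqrt(ab)*x_0 + sqrt(1-ab)*eps
    x = ab.sqrt() * tail + (1 - ab).sqrt() * torch.randn_like(tail)
    for step in reversed(range(t_start)):
        t = torch.full((b,), step, dtype=torch.long)
        eps = model(x, head, rel, t)
        a, ab = alphas[step], alpha_bars[step]
        # DDPM reverse-step mean, plus noise on all but the final step
        x = (x - (1 - a) / (1 - ab).sqrt() * eps) / a.sqrt()
        if step > 0:
            x = x + betas[step].sqrt() * torch.randn_like(x)
    return x  # synthetic negative tail embedding

# Usage: varying t_start trades off how close negatives stay to the positive.
model = CondDenoiser(dim=200)
h, r, t_true = (torch.randn(4, 200) for _ in range(3))
near_positive = generate_negative(model, t_true, h, r, t_start=20)
far_from_positive = generate_negative(model, t_true, h, r, t_start=90)
```

In this reading, synthetic negatives generated at different timesteps can be mixed with conventionally sampled ones to cover a range of difficulties, consistent with the abstract's multiscale difficulty configuration.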