EMU: EFFICIENT NEGATIVE SAMPLE GENERATION METHOD FOR KNOWLEDGE GRAPH LINK PREDICTION

20 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: knowledge base, link prediction, representation learning, negative sample generation
TL;DR: We propose a new, efficient negative sample generation method for knowledge graph link prediction.
Abstract: Knowledge graph embedding (KGE) models encode information in knowledge graphs for the purpose of predicting new links. To train these models effectively, it is essential to learn to discriminate between positive and negative samples. Prior research has demonstrated that improving the quality of negative samples can lead to significant gains in model accuracy. To this end, our paper proposes Embedding Mutation and Unbounded label Smoothing (EMU), a novel approach to generating hard negative samples, distinct from traditional efforts to identify more difficult negatives within the training data. By corrupting negative samples with mutations derived from true samples, EMU creates more challenging negatives that are harder to distinguish from true samples. Importantly, EMU's simplicity allows it to be seamlessly combined with existing KGE models and other negative sampling methods. Our experiments show that EMU consistently improves link prediction performance across a range of KGE models and negative sampling techniques. An implementation of the method and experiments are available at \url{https://anonymous.4open.science/r/EMU-KG-6E58}.
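The authors' implementation is at the anonymized URL above; as a rough illustration of the embedding-mutation idea sketched in the abstract, the following NumPy snippet corrupts negative-sample embeddings by copying a random subset of dimensions from the true sample's embedding. All names, and the `mutation_rate` hyperparameter, are assumptions for illustration, not the paper's actual code.

```python
import numpy as np

rng = np.random.default_rng(0)

def mutate_negatives(neg_emb, pos_emb, mutation_rate=0.5):
    """Make negatives harder by mixing in dimensions of the true embedding.

    neg_emb: (num_neg, dim) embeddings of sampled negative entities
    pos_emb: (dim,) embedding of the true entity
    mutation_rate: assumed fraction of dimensions replaced per negative
    """
    num_neg, dim = neg_emb.shape
    # One random binary mask per negative: True = take the positive's value.
    mask = rng.random((num_neg, dim)) < mutation_rate
    return np.where(mask, pos_emb, neg_emb)

# Toy usage: 4 negative samples in an 8-dimensional embedding space.
neg = rng.normal(size=(4, 8))
pos = rng.normal(size=8)
hard_neg = mutate_negatives(neg, pos)
```

Each mutated negative is elementwise either the original negative or the true embedding, so it lies "between" the two and is harder for the scoring function to separate, which is the intuition the abstract describes.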
Supplementary Material: pdf
Primary Area: representation learning for computer vision, audio, language, and other modalities
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 2406