DSparsE: Dynamic Sparse Embedding for Knowledge Graph Completion

22 Sept 2023 (modified: 11 Feb 2024), Submitted to ICLR 2024
Primary Area: representation learning for computer vision, audio, language, and other modalities
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Knowledge graph completion, Link prediction, Dynamic learning, Sparse embedding
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: Addressing the incompleteness of knowledge graphs remains a significant challenge. Existing graph completion methods, such as ComDensE (a representative fully connected network) and InteractE (a representative convolutional network), have notable limitations: ComDensE is prone to overfitting and constrained in network depth, while InteractE is limited in feature interaction and interpretability. To overcome these drawbacks, we propose the Dynamic Sparse Embedding (DSparsE) model, which employs sparse learning techniques to replace conventional dense layers with adaptable sparse ones. DSparsE incorporates a structure reminiscent of the Mixture of Experts (MoE) at the encoding stage and a residual structure at the decoding stage, which improves feature extraction and decoding without a significant increase in parameters. Comparative experiments on the FB15k-237 and WN18RR datasets demonstrate that DSparsE outperforms both ComDensE and InteractE on FB15k-237 in terms of Hits@1, with improvements of 2.3\% and 3.0\%, respectively.
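The abstract describes three ingredients: dense layers replaced by adaptable sparse ones, an MoE-like gated encoder, and a residual decoder. The sketch below illustrates these ideas in a minimal, forward-pass-only form using NumPy. All names, dimensions, and the magnitude-based top-k sparsification rule are illustrative assumptions, not the authors' actual implementation (the submission's details are not reproduced here).

```python
import numpy as np

rng = np.random.default_rng(0)

def top_k_mask(weight, density):
    """Keep only the largest-magnitude fraction of weights: a common way
    to impose an adaptable sparse topology in place of a dense layer.
    (Assumed sparsification rule, for illustration only.)"""
    k = max(1, int(density * weight.size))
    threshold = np.sort(np.abs(weight), axis=None)[-k]
    return (np.abs(weight) >= threshold).astype(weight.dtype)

def sparse_linear(x, weight, mask):
    """Linear layer whose weight matrix is elementwise-masked to a
    sparse connectivity pattern."""
    return x @ (weight * mask).T

def moe_encode(x, experts, gate_w):
    """MoE-style encoding: a softmax gate mixes the outputs of several
    sparse expert layers."""
    logits = x @ gate_w                                   # (batch, n_experts)
    gate = np.exp(logits - logits.max(axis=1, keepdims=True))
    gate /= gate.sum(axis=1, keepdims=True)
    outs = np.stack([sparse_linear(x, w, m) for w, m in experts], axis=1)
    return (gate[..., None] * outs).sum(axis=1)           # (batch, d_hid)

# Toy dimensions (hypothetical; the paper's sizes are not given in the abstract)
d_in, d_hid, n_experts, density = 8, 16, 3, 0.2

experts = []
for _ in range(n_experts):
    w = rng.standard_normal((d_hid, d_in))
    experts.append((w, top_k_mask(w, density)))
gate_w = rng.standard_normal((d_in, n_experts))

x = rng.standard_normal((4, d_in))
h = moe_encode(x, experts, gate_w)

# Residual structure at the decoding stage: the block's input is added
# back to its (sparse) transformation.
w_dec = rng.standard_normal((d_hid, d_hid))
decoded = h + sparse_linear(h, w_dec, top_k_mask(w_dec, density))
```

Because roughly `1 - density` of each expert's weights are zeroed, the sparse layers add capacity through multiple experts while keeping the effective parameter count low, which matches the abstract's claim of improved feature extraction without a significant increase in parameters.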
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 4899