Sequential Query Encoding for Complex Query Answering on Knowledge Graphs

Published: 25 Jun 2023, Last Modified: 25 Jun 2023. Accepted by TMLR.
Abstract: Complex Query Answering (CQA) is an important and fundamental task for knowledge graph (KG) reasoning. Query encoding (QE) has been proposed as a fast and robust solution to CQA. In the encoding process, most existing QE methods first parse the logical query into an executable computational directed acyclic graph (DAG), then use neural networks to parameterize the operators, and finally execute these neuralized operators recursively. However, this parameterization-and-execution paradigm may be over-complicated, as it can be structurally simplified by a single neural network encoder. Meanwhile, sequence encoders, such as LSTM and Transformer, have proved effective for encoding semantic graphs in related tasks. Motivated by this, we propose sequential query encoding (SQE) as an alternative way to encode queries for CQA. Instead of parameterizing and executing the computational graph, SQE first uses a search-based algorithm to linearize the computational graph into a sequence of tokens and then uses a sequence encoder to compute its vector representation. This vector representation then serves as the query embedding, and answers are retrieved from the embedding space according to similarity scores. Despite its simplicity, SQE demonstrates state-of-the-art neural query encoding performance on FB15k, FB15k-237, and NELL on an extended benchmark covering twenty-nine types of in-distribution queries. Further experiments show that SQE also demonstrates comparable knowledge inference capability on out-of-distribution queries, whose query types are not observed during training.
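To make the encoding pipeline described above concrete, here is a minimal sketch of the idea: a linearized query (a token sequence produced from the computational graph) is fed to a Transformer encoder, and the resulting query embedding is scored against all entity embeddings. All names and design choices here (SequentialQueryEncoder, linearize_query, using the first token's representation as the query embedding, the toy vocabulary sizes) are illustrative assumptions, not the authors' actual implementation; see the linked code repository for the real one.

```python
# Minimal sketch of sequential query encoding (hypothetical names, not the paper's code).
import torch
import torch.nn as nn


def linearize_query(computational_graph):
    """Toy linearization: assume the computational graph is already given as a
    prefix-style list of operator/relation/entity tokens. (The paper uses a
    search-based algorithm to produce this token sequence.)"""
    return computational_graph  # e.g. ["(", "p", "r1", "(", "e", "e42", ")", ")"]


class SequentialQueryEncoder(nn.Module):
    def __init__(self, vocab_size, num_entities, dim=256, n_layers=2, n_heads=4):
        super().__init__()
        self.token_emb = nn.Embedding(vocab_size, dim)
        self.pos_emb = nn.Embedding(512, dim)
        layer = nn.TransformerEncoderLayer(dim, n_heads, dim * 4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.entity_emb = nn.Embedding(num_entities, dim)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer ids of the linearized query tokens
        positions = torch.arange(token_ids.size(1), device=token_ids.device)
        h = self.token_emb(token_ids) + self.pos_emb(positions)
        h = self.encoder(h)
        query_vec = h[:, 0]  # take the first position's state as the query embedding
        # Similarity scores against all entity embeddings; answers are retrieved
        # by ranking entities according to these scores.
        return query_vec @ self.entity_emb.weight.T


# Usage example with toy sizes:
model = SequentialQueryEncoder(vocab_size=100, num_entities=1000)
token_ids = torch.randint(0, 100, (2, 12))  # two linearized queries of length 12
scores = model(token_ids)                   # (2, 1000) entity scores
```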
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: Section 3.2: fixed a typo in Formula 3. Section 5.5: added a discussion of improving the compositional generalizability of the SQE-Transformer model. Table 7: added experimental results on SQE-Transformer with relative positional encoding. Section 6: added references to recent complex query answering papers. Added the Acknowledgements section.
Code: https://github.com/HKUST-KnowComp/SQE
Assigned Action Editor: ~Alessandro_Sordoni1
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Number: 896