DistilCypherGPT: Enhancing Large Language Models for Knowledge Graph Question Answering in Cypher through Knowledge Distillation
Abstract: Knowledge Graph Question Answering (KGQA) systems let users query knowledge graphs in natural language by translating their questions into structured database queries such as Cypher. Existing KGQA approaches often rely on large language models, incurring high computational costs and slow inference that hinder real-time use. To address these challenges, DistilCypherGPT is introduced: an efficient KGQA framework that applies knowledge distillation in a teacher-student architecture, optimized for Cypher query generation over academic knowledge graphs. DistilCypherGPT significantly reduces computational demands, enabling deployment in resource-constrained environments while retaining high accuracy. Experimental results show that DistilCypherGPT maintains 99.51% accuracy while achieving a 23% reduction in model size and a 30% improvement in inference speed relative to the baseline. These findings demonstrate DistilCypherGPT's potential as a scalable, high-performance solution for KGQA, advancing efficient, real-time query translation with minimal computational overhead.
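The abstract does not give implementation details of the teacher-student training, so the following is only a minimal sketch of a standard knowledge-distillation objective for text-to-Cypher generation: a student model is trained against both the teacher's soft token distributions and the gold Cypher tokens. The temperature, mixing weight `alpha`, and the specific loss form are generic assumptions, not details taken from the DistilCypherGPT paper.

```python
# Minimal sketch of a generic teacher-student distillation loss for
# sequence-to-Cypher generation. All hyperparameters here are illustrative
# assumptions, not values from the DistilCypherGPT paper.
import torch
import torch.nn.functional as F


def distillation_loss(student_logits, teacher_logits, target_ids,
                      temperature=2.0, alpha=0.5, pad_id=0):
    """Mix a soft-target KL term (teacher -> student) with hard
    cross-entropy against the reference Cypher tokens.

    student_logits, teacher_logits: (batch, seq_len, vocab)
    target_ids: (batch, seq_len) gold Cypher token ids
    """
    # Soft targets: pull the student's token distribution toward the teacher's.
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    kd = F.kl_div(soft_student, soft_teacher,
                  reduction="batchmean") * temperature ** 2

    # Hard targets: ordinary next-token cross-entropy on the gold query.
    ce = F.cross_entropy(
        student_logits.reshape(-1, student_logits.size(-1)),
        target_ids.reshape(-1),
        ignore_index=pad_id,
    )
    return alpha * kd + (1.0 - alpha) * ce


if __name__ == "__main__":
    # Toy usage: random logits standing in for teacher/student model outputs.
    batch, seq_len, vocab = 2, 8, 100
    student = torch.randn(batch, seq_len, vocab, requires_grad=True)
    teacher = torch.randn(batch, seq_len, vocab)
    targets = torch.randint(1, vocab, (batch, seq_len))
    print(distillation_loss(student, teacher, targets).item())
```

In this generic setup, the smaller student retains most of the teacher's query-generation behavior while cutting parameter count and inference latency, which is the trade-off the reported 99.51% accuracy, 23% size reduction, and 30% speedup correspond to.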
External IDs: dblp:journals/datamine/ChongLL25