KGNN-KT: Enhancing Knowledge Tracing in Programming Education Through LLM-Extracted Knowledge Graphs
Abstract: This paper introduces KGNN-KT, a neural knowledge tracing framework that enhances programming education through structured knowledge representation. Our approach combines large language models (LLMs) with graph neural networks to model both student learning patterns and conceptual relationships in programming. The system first constructs a comprehensive knowledge graph by using LLMs to extract programming concepts from problem descriptions and solutions, capturing hierarchical dependencies among data structures, algorithms, and programming paradigms. The KGNN-KT model then processes this structured knowledge alongside student interaction histories through a multimodal architecture that integrates: (1) semantic embeddings of problem texts and code, (2) temporal modeling of student performance trajectories, and (3) graph-enhanced concept representations. Experiments on three programming education datasets demonstrate consistent improvements, with an overall AUC of 0.84 (3.9% higher than the strongest baselines) and particularly strong results on complex problems (+5.3% AUC gain). Our work advances personalized programming education by bridging neural knowledge tracing with explicit knowledge structures, offering both accurate performance prediction and actionable curriculum insights. The system's modular design supports extensions to diverse programming domains and adaptive learning scenarios.
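To make the three-part architecture concrete, the following is a minimal, hypothetical sketch of a graph-enhanced knowledge-tracing pipeline in the spirit the abstract describes: a prerequisite graph over concepts is propagated through one graph-convolution step, student interactions update a temporal state, and a sigmoid scores the next response. All names, shapes, and update rules here are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)
n_concepts, d = 4, 8

# (1) Knowledge graph (assumed LLM-extracted): adjacency over concepts,
#     e.g. "arrays" -> "sorting" -> "binary search" prerequisite edges.
A = np.array([[0, 1, 0, 0],
              [0, 0, 1, 1],
              [0, 0, 0, 0],
              [0, 0, 0, 0]], dtype=float)
A_hat = A + np.eye(n_concepts)            # add self-loops
D_inv = np.diag(1.0 / A_hat.sum(axis=1))  # row-degree normalisation

concept_emb = rng.normal(size=(n_concepts, d))
# One graph-convolution layer: each concept absorbs prerequisite context.
graph_emb = np.tanh(D_inv @ A_hat @ concept_emb)

# (2) Temporal modelling: fold each (concept, correct?) interaction
#     into a running student knowledge state.
W = rng.normal(scale=0.1, size=(d, d))

def update(state, concept, correct):
    signal = graph_emb[concept] * (1.0 if correct else -1.0)
    return np.tanh(state @ W + signal)

# (3) Predict the probability of answering a problem on `concept` correctly.
def predict(state, concept):
    return 1.0 / (1.0 + np.exp(-(state @ graph_emb[concept])))

state = np.zeros(d)
for concept, correct in [(0, True), (1, True), (2, False)]:
    state = update(state, concept, correct)

p = predict(state, 2)
print(0.0 < float(p) < 1.0)  # a valid probability for the next attempt
```

In the full model, the random concept embeddings would be replaced by semantic embeddings of problem text and code, and the hand-rolled update by a learned recurrent or attention-based module.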