Hyperbolic Hypergraph Transformer With Knowledge State Disentanglement for Knowledge Tracing

Published: 2025, Last Modified: 22 Jan 2026. IEEE Trans. Knowl. Data Eng. 2025. License: CC BY-SA 4.0
Abstract: Knowledge Tracing (KT) refers to inferring students' knowledge mastery and predicting their future performance. KT serves as the foundation for personalized learning and enhances the effectiveness of educational interventions, making it a crucial technology in intelligent tutoring systems. Recent approaches have achieved notable success by harnessing the representational capacity of deep learning. However, complex neural networks produce entangled knowledge state embeddings, in which the embedding dimensions are coupled, limiting their expressiveness and interpretability. In addition, existing methods operate in Euclidean space, which distorts the complex relationships among knowledge states: distances and geometric structures among knowledge states are altered during the embedding process. To address these challenges, we propose a hyperbolic hypergraph transformer with knowledge state Disentanglement for Knowledge Tracing, named DisenKT. We construct a hypergraph from students' response sequences and project it into hyperbolic space to alleviate the representation distortion of questions and knowledge states. The embeddings of hierarchical knowledge states are refined through message passing between questions and students using the proposed hyperbolic hypergraph transformer. Moreover, we are the first to disentangle knowledge states via a contrastive clustering auxiliary task, which enhances the expressiveness and interpretability of knowledge state embeddings. Extensive experiments on three public datasets demonstrate that DisenKT outperforms state-of-the-art methods in both student performance prediction and interpretability.
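The abstract does not specify which hyperbolic model DisenKT uses, but the projection step it describes is commonly realized with the exponential map into the Poincaré ball. The sketch below is a minimal, hypothetical illustration of that idea (function names and the choice of the Poincaré model are assumptions, not taken from the paper): Euclidean embeddings are mapped inside the unit ball, where the hyperbolic geodesic distance grows rapidly near the boundary, allowing tree-like hierarchies to be embedded with low distortion.

```python
import numpy as np

def exp_map_zero(v, c=1.0, eps=1e-9):
    """Hypothetical projection step: map a Euclidean (tangent-space)
    embedding into the Poincare ball of curvature -c via the
    exponential map at the origin."""
    norm = np.maximum(np.linalg.norm(v, axis=-1, keepdims=True), eps)
    return np.tanh(np.sqrt(c) * norm) * v / (np.sqrt(c) * norm)

def poincare_dist(x, y, c=1.0):
    """Geodesic distance in the Poincare ball. It diverges as points
    approach the boundary, which is what lets hyperbolic space capture
    hierarchical relations that Euclidean distance distorts."""
    sqc = np.sqrt(c)
    num = 2 * c * np.sum((x - y) ** 2, axis=-1)
    den = (1 - c * np.sum(x ** 2, axis=-1)) * (1 - c * np.sum(y ** 2, axis=-1))
    return np.arccosh(1 + num / den) / sqc
```

In a KT pipeline of this kind, question and student embeddings would be projected with `exp_map_zero` before hypergraph message passing, and similarities would be measured with `poincare_dist` rather than Euclidean distance.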