Keywords: knowledge tracing, deep learning, recurrent neural networks, knowledge state modeling
Abstract: Knowledge tracing aims to predict students' future performance from their past interactions, helping online learning platforms and teachers assess learners' knowledge levels. This technology plays a critical role in achieving large-scale cognitive diagnosis. Recently, deep learning-based knowledge tracing models have demonstrated impressive results, with most research focusing on designing customized network architectures and novel optimization objectives. However, redundant parameters and overly complex loss functions can complicate model training and make it harder to maintain prediction accuracy. To further investigate the effectiveness of simple recurrent neural networks in this field, and to leverage their advantages in handling sequential exercise representations, this paper introduces a GRU-based knowledge tracing model named ExerCAKT (Exercise Context-Aware Knowledge Tracing). The model effectively captures contextual features of exercises and achieves robust knowledge state modeling through a GRU-based knowledge state feature extractor and a GRU-based exercise feature extractor, without relying on additional optimization objectives. Its superior performance is validated through comparisons with baseline models such as AKT and SIMPLEKT on three public knowledge tracing datasets, using AUC and ACC metrics at both the knowledge concept level and the question level. We demonstrate that simple recurrent neural networks, combined with appropriate representation methods, can still achieve excellent performance in this field. Our code will be available at xxx (Anonymous URL).
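To make the described two-extractor design concrete, below is a minimal PyTorch sketch of a knowledge tracing model built only from GRUs: one GRU tracks the evolving knowledge state from past (question, response) interactions, and a second GRU encodes contextual exercise features from the question sequence. All module names, dimensions, and the fusion scheme are illustrative assumptions, not the authors' ExerCAKT implementation.

```python
import torch
import torch.nn as nn

class GRUKnowledgeTracer(nn.Module):
    """Sketch of a GRU-only knowledge tracing model (assumed architecture)."""

    def __init__(self, num_questions: int, emb_dim: int = 64, hidden_dim: int = 64):
        super().__init__()
        # Embed (question, response) pairs: index = question + num_questions * response
        self.interaction_emb = nn.Embedding(2 * num_questions, emb_dim)
        self.question_emb = nn.Embedding(num_questions, emb_dim)
        # GRU over past interactions -> evolving knowledge state features
        self.state_gru = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        # GRU over the question sequence -> contextual exercise features
        self.exercise_gru = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.predictor = nn.Sequential(
            nn.Linear(2 * hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, questions: torch.Tensor, responses: torch.Tensor) -> torch.Tensor:
        # questions, responses: (batch, seq_len) integer tensors, responses in {0, 1}
        interactions = questions + responses * self.question_emb.num_embeddings
        state, _ = self.state_gru(self.interaction_emb(interactions))
        context, _ = self.exercise_gru(self.question_emb(questions))
        # Predict correctness at step t+1 from the state after step t and the
        # contextual features of the next question.
        logits = self.predictor(torch.cat([state[:, :-1], context[:, 1:]], dim=-1))
        return torch.sigmoid(logits).squeeze(-1)

# Example usage with a toy batch of 2 students and sequences of length 5.
model = GRUKnowledgeTracer(num_questions=100)
q = torch.randint(0, 100, (2, 5))
r = torch.randint(0, 2, (2, 5))
probs = model(q, r)  # (2, 4) predicted correctness probabilities for steps 2..5
```

The sketch is trained with plain binary cross-entropy on the predicted probabilities, matching the abstract's claim that no additional optimization objectives are needed.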
Supplementary Material: zip
Primary Area: applications to neuroscience & cognitive science
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 11406