EduLLM: Leveraging Large Language Models and Framelet-Based Signed Hypergraph Neural Networks for Student Performance Prediction
TL;DR: An Exploration of LLMs and Signed Hypergraph Learning for Educational Data Analytics
Abstract: The growing demand for personalized learning underscores the importance of accurately predicting students' future performance to support tailored education and optimize instructional strategies. Traditional approaches predominantly focus on temporal modeling of historical response records and learning trajectories. While effective, these methods often fall short in capturing the intricate interactions between students and learning content, as well as the subtle semantics of these interactions. To address these gaps, we present EduLLM, the first framework to combine large language models with hypergraph learning for student performance prediction. The framework incorporates FraS-HNN ($\underline{\mbox{Fra}}$melet-based $\underline{\mbox{S}}$igned $\underline{\mbox{H}}$ypergraph $\underline{\mbox{N}}$eural $\underline{\mbox{N}}$etworks), a novel spectral model for signed hypergraph learning designed to model interactions between students and multiple-choice questions. In this setup, students and questions are represented as nodes, while response records are encoded as positively and negatively signed hyperedges, capturing both the structural and semantic intricacies of personalized learning behaviors. FraS-HNN employs framelet-based low-pass and high-pass filters to extract multi-frequency features. EduLLM further integrates fine-grained semantic features derived from LLMs with these signed hypergraph representations to improve prediction accuracy. Extensive experiments on multiple educational datasets demonstrate that EduLLM significantly outperforms state-of-the-art baselines, validating the integration of LLMs with FraS-HNN for signed hypergraph learning.
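As a minimal illustration (not the authors' implementation) of how response records can be encoded as a signed hypergraph, filtered into low- and high-frequency components, and fused with LLM-derived features, consider the Python sketch below. All function names, the choice of Laplacian, and the filter form are assumptions for exposition only; the paper's actual framelet construction may differ.

import numpy as np

def build_signed_incidence(records, n_students, n_questions):
    """records: list of (student_id, question_id, is_correct).
    Returns a signed incidence matrix H of shape (n_nodes, n_hyperedges),
    where each response record forms one hyperedge over {student, question},
    signed +1 for a correct answer and -1 for an incorrect one."""
    n_nodes = n_students + n_questions
    H = np.zeros((n_nodes, len(records)))
    for e, (s, q, correct) in enumerate(records):
        sign = 1.0 if correct else -1.0
        H[s, e] = sign                      # student node
        H[n_students + q, e] = sign         # question node (offset by n_students)
    return H

def signed_hypergraph_laplacian(H):
    """Unnormalized signed Laplacian L = D - A with A = H H^T (one plausible
    choice; the paper's spectral construction may differ)."""
    A = H @ H.T
    np.fill_diagonal(A, 0.0)
    D = np.diag(np.abs(A).sum(axis=1))
    return D - A

def lowpass_highpass(L, X, alpha=0.5):
    """Toy stand-in for framelet low-/high-pass filtering of node features X:
    the low-pass branch smooths features over the signed structure,
    the high-pass branch keeps the residual (higher-frequency) component."""
    n = L.shape[0]
    d = np.maximum(np.diag(L), 1e-8)
    L_norm = L / np.sqrt(np.outer(d, d))    # symmetric normalization
    low = (np.eye(n) - alpha * L_norm) @ X  # low-pass: structure-smoothed features
    high = X - low                          # high-pass: multi-frequency residual
    return low, high

# Tiny usage example: 2 students, 2 questions, placeholder feature vectors.
records = [(0, 0, True), (0, 1, False), (1, 0, False), (1, 1, True)]
H = build_signed_incidence(records, n_students=2, n_questions=2)
L = signed_hypergraph_laplacian(H)
X = np.random.randn(4, 8)                   # placeholder node features
llm_feats = np.random.randn(4, 8)           # placeholder LLM-derived semantic features
low, high = lowpass_highpass(L, X)
fused = np.concatenate([low, high, llm_feats], axis=1)  # input to a downstream predictor
print(fused.shape)                          # (4, 24)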
Lay Summary: To support more personalized and effective education, it is important to understand how students are likely to perform in the future. Our research introduces a new approach, called EduLLM, that combines the power of large language models with a novel type of network analysis known as signed hypergraph neural networks. This method allows us to model complex learning behaviors by capturing not only who interacts with what learning content, but also whether those interactions are positive or negative, such as correct or incorrect answers. By combining these structural patterns with deeper language understanding, EduLLM offers more accurate predictions of student performance. Beyond education, the proposed signed hypergraph neural networks also show strong potential for advancing hypergraph learning and supporting applications in other domains.
Application-Driven Machine Learning: This submission is on Application-Driven Machine Learning.
Primary Area: Deep Learning->Graph Neural Networks
Keywords: Signed Hypergraph Neural Network, Hypergraph Learning, Graph Neural Networks, Large Language Models
Submission Number: 5416