Knowledge tracing aims to trace students' evolving knowledge states by predicting their future performance on concept-related exercises. Recently, some graph-based models have been developed to incorporate the relationships between exercises to improve knowledge tracing, but they generally explore only a single type of relationship information. In this paper, we present a novel Dual Graph Ensemble learning method for Knowledge Tracing (DGEKT), which establishes a dual graph structure over students' learning interactions to capture the heterogeneous exercise-concept associations and interaction transitions through hypergraph modeling and directed graph modeling, respectively. To ensemble the dual graph models, we introduce the technique of online knowledge distillation, because, although a knowledge tracing model is expected to predict students' responses to exercises related to different concepts, it is optimized only for the prediction accuracy on a single exercise at each step. With online knowledge distillation, the dual graph models are adaptively combined to form a stronger teacher model, which in turn provides its predictions on all exercises as extra supervision for better modeling ability. In the experiments, we compare DGEKT against eight knowledge tracing baselines on three benchmark datasets, and the results demonstrate that DGEKT achieves state-of-the-art performance.
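The adaptive ensemble and distillation step described above can be sketched numerically. The snippet below is a minimal illustration, not the paper's implementation: the gate logits, branch probabilities, and the per-exercise Bernoulli KL term are all hypothetical stand-ins for the learned components of DGEKT. It shows how the two branch predictions would be adaptively fused into a teacher, which then supervises each branch on all exercises rather than only the single answered one.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def bernoulli_kl(p, q, eps=1e-7):
    """Sum of per-exercise KL divergences between Bernoulli predictions p and q."""
    p = np.clip(p, eps, 1 - eps)
    q = np.clip(q, eps, 1 - eps)
    return float(np.sum(p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q))))

# Hypothetical correctness probabilities from the two graph branches
# over three exercises (in DGEKT these come from the hypergraph and
# directed-transition-graph models).
p_hyper = np.array([0.8, 0.3, 0.6])  # exercise-concept hypergraph branch
p_trans = np.array([0.7, 0.4, 0.9])  # interaction-transition graph branch

# Adaptive ensemble: gate weights (learned in the real model) combine
# the branches into the teacher prediction.
gate = softmax(np.array([0.2, -0.1]))
teacher = gate[0] * p_hyper + gate[1] * p_trans

# Online distillation term: each branch is pulled toward the teacher
# on ALL exercises, supplementing the single-exercise response loss.
loss_distill = bernoulli_kl(teacher, p_hyper) + bernoulli_kl(teacher, p_trans)
```

Because the teacher is a convex combination of the two branch predictions, it always lies between them per exercise, and the distillation loss is non-negative, vanishing only when both branches already agree with the teacher.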