Traditional learning-based approaches to student modeling (e.g., predicting grades from measured activities) generalize poorly to underrepresented/minority student groups due to biases in data availability. In this paper, we propose a Multi-Layer Personalized Federated Learning (MLPFL) methodology which optimizes inference accuracy over different layers of student grouping criteria, such as by course and by demographic subgroups within each course. In our approach, personalized models for individual student subgroups are derived from a global model, which is trained in a distributed fashion via meta-gradient updates that account for subgroup heterogeneity while preserving modeling commonalities that exist across the full dataset. To evaluate our methodology, we consider case studies of two popular downstream student modeling tasks, knowledge tracing and outcome prediction, which leverage multiple modalities of student behavior (e.g., visits to lecture videos and participation in forums) in model training. Experiments on three real-world datasets from online courses demonstrate that our approach obtains substantial improvements over existing student modeling baselines, increasing the average and decreasing the variance of prediction quality across different student subgroups. Visual analysis of the resulting students' knowledge state embeddings confirms that our personalization methodology extracts activity patterns which cluster into different student subgroups, consistent with the performance enhancements we obtain over the baselines.
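As a rough illustration of the personalization idea described above (and not the paper's exact algorithm), the following minimal sketch shows a Reptile-style meta-gradient loop in which a shared global model is nudged toward parameters adapted on each student subgroup, after which per-subgroup personalized models are obtained by local fine-tuning. The linear predictor and the helper names (`subgroup_grad`, `adapt`, `meta_train`) are hypothetical placeholders for exposition only.

```python
import numpy as np

def subgroup_grad(w, X, y):
    """Gradient of mean squared error for a linear predictor on one subgroup's data."""
    return 2.0 * X.T @ (X @ w - y) / len(y)

def adapt(w, X, y, inner_lr=0.01, inner_steps=5):
    """Inner loop: a few local gradient steps on a single subgroup (personalization)."""
    w = w.copy()
    for _ in range(inner_steps):
        w -= inner_lr * subgroup_grad(w, X, y)
    return w

def meta_train(subgroups, dim, meta_lr=0.1, rounds=100):
    """Outer loop: move the global model toward each subgroup-adapted model (Reptile-style)."""
    w_global = np.zeros(dim)
    for _ in range(rounds):
        adapted = [adapt(w_global, X, y) for X, y in subgroups]
        # Meta-gradient update: average displacement toward the adapted weights,
        # retaining commonalities while accounting for subgroup heterogeneity.
        w_global += meta_lr * np.mean([w_a - w_global for w_a in adapted], axis=0)
    return w_global

# Toy usage: two synthetic subgroups with related but distinct input-output relationships.
rng = np.random.default_rng(0)
subgroups = []
for shift in (0.0, 0.5):
    X = rng.normal(size=(50, 3))
    y = X @ (np.array([1.0, -1.0, 0.5]) + shift) + 0.1 * rng.normal(size=50)
    subgroups.append((X, y))

w_global = meta_train(subgroups, dim=3)
personalized = [adapt(w_global, X, y) for X, y in subgroups]  # per-subgroup models
```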