In human learning, an effective technique for improving learning outcomes is learning by teaching: a learner deepens their understanding of a topic by teaching it to others. In this paper, we borrow this teaching-driven learning methodology from humans and leverage it to train more performant machine learning models, proposing a novel ML framework referred to as learning by teaching (LBT). In LBT, a teacher model improves itself by teaching a student model to learn well. Specifically, the teacher creates a pseudo-labeled dataset and uses it to train a student model. Based on how the student performs on a validation dataset, the teacher re-learns its model and re-teaches the student until the student achieves strong validation performance. Our framework is formulated as a three-level optimization problem comprising three stages: the teacher learns; the teacher teaches the student; the teacher re-learns based on how well the student performs. A simple yet efficient algorithm is developed to solve this three-level optimization problem. We apply LBT to neural architecture search on CIFAR-10, CIFAR-100, and ImageNet, and demonstrate the efficacy of our method in various experiments.
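The three-stage loop can be illustrated with a deliberately tiny sketch. Everything here is hypothetical: the "models" are 1-D threshold classifiers on synthetic data, and a random hill-climbing step stands in for the paper's optimization-based teacher update. It only shows the control flow — teacher pseudo-labels unlabeled data, a student fits those pseudo-labels, and the teacher is updated according to the student's validation accuracy.

```python
import random

def fit_threshold(xs, ys):
    # Student training: pick the threshold best separating the pseudo-labels
    # (class 1 when x > threshold).
    best_t, best_acc = 0.0, -1.0
    for t in sorted(set(xs)):
        acc = sum((x > t) == y for x, y in zip(xs, ys)) / len(xs)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

def accuracy(t, xs, ys):
    return sum((x > t) == y for x, y in zip(xs, ys)) / len(xs)

random.seed(0)
# Synthetic 1-D task: the true decision boundary is at x = 0.5.
unlabeled = [random.random() for _ in range(200)]
val_x = [random.random() for _ in range(100)]
val_y = [x > 0.5 for x in val_x]

teacher_t = 0.1  # poorly initialized teacher
for _ in range(50):
    # Stage 2: teacher pseudo-labels the unlabeled set; student fits them.
    pseudo = [x > teacher_t for x in unlabeled]
    student_t = fit_threshold(unlabeled, pseudo)
    # Stage 3: teacher re-learns from the student's validation performance.
    # A random perturbation, accepted only on strict improvement, stands in
    # for the gradient-based update used in the actual framework.
    candidate = teacher_t + random.gauss(0, 0.1)
    cand_pseudo = [x > candidate for x in unlabeled]
    cand_student = fit_threshold(unlabeled, cand_pseudo)
    if accuracy(cand_student, val_x, val_y) > accuracy(student_t, val_x, val_y):
        teacher_t = candidate
```

After the loop, `teacher_t` has moved close to the true boundary at 0.5: the teacher improved purely by observing how well its students generalized to validation data, which is the intuition the three-level formulation makes precise.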