In recent years, machine learning, and neural networks in particular, have penetrated deeply into people's lives. As the price of this convenience, people's private information is at risk of disclosure. The "right to be forgotten" was introduced in response, stipulating that individuals have the right to withdraw consent from personal-information processing activities that were based on that consent. To address this problem, machine unlearning has been proposed, which allows a model to erase all memory of the private information. Previous approaches, including retraining and incremental updates to the model, often require extra storage space or are difficult to apply to neural networks. Our method only needs to apply a small perturbation to the weights of the target model and then iterate it in the direction of a model trained on the remaining data subset, until the contribution of the data to be unlearned is completely eliminated. Experiments on five datasets demonstrate the effectiveness of our method for machine unlearning, and our method is 15 times faster than retraining.
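As a rough illustration of the idea described above, the following PyTorch sketch first applies a small random perturbation to the target model's weights and then takes gradient steps on the retained data subset, nudging the model toward the one that would result from retraining on that subset. The function name `unlearn`, the hyperparameters, and the use of plain SGD on the retained data are illustrative assumptions, not the paper's exact procedure.

```python
import copy
import torch
import torch.nn.functional as F

def unlearn(model, retain_loader, noise_scale=1e-3, lr=1e-3, steps=100, device="cpu"):
    """Hypothetical sketch of perturbation-based unlearning.

    1. Add a small random perturbation to the target model's weights.
    2. Iteratively update the perturbed model using only the retained
       data subset, moving it toward the model one would obtain by
       retraining from scratch on that subset.
    """
    model = copy.deepcopy(model).to(device)

    # Step 1: small random perturbation of every weight tensor.
    with torch.no_grad():
        for p in model.parameters():
            p.add_(noise_scale * torch.randn_like(p))

    # Step 2: gradient steps on the retained data only (assumed realization
    # of "iterating in the direction of the model trained on the remaining data").
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    model.train()
    it = iter(retain_loader)
    for _ in range(steps):
        try:
            x, y = next(it)
        except StopIteration:
            it = iter(retain_loader)
            x, y = next(it)
        x, y = x.to(device), y.to(device)
        opt.zero_grad()
        loss = F.cross_entropy(model(x), y)
        loss.backward()
        opt.step()
    return model
```

In this reading, the perturbation breaks the exact memorization of the forgotten samples, and the subsequent updates on the retained subset restore utility, which is why the procedure can be much cheaper than full retraining.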