We introduce Opacus, a free, open-source PyTorch library for training deep learning models with differential privacy (hosted at opacus.ai). Opacus is designed for simplicity, flexibility, and speed. It provides a simple, user-friendly API and enables machine learning practitioners to make a training pipeline private by adding as little as two lines of code. It supports a wide variety of layers out of the box, including multi-head attention, convolution, LSTM, and embedding, and also provides the means to support custom user-defined layers. Opacus computes batched per-sample gradients, yielding better efficiency than the traditional "micro-batch" approach. In this paper we present Opacus, detail the principles that drove its implementation and its unique features, and compare its performance against other frameworks for differential privacy in machine learning.
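To make the per-sample gradient approach concrete, the following is a minimal numpy sketch of the core DP-SGD update that such a library performs: clip each sample's gradient to a fixed L2 bound, average, and add Gaussian noise calibrated to that bound. The function name, signature, and defaults are illustrative assumptions, not Opacus's actual API.

```python
import numpy as np

def dp_sgd_step(per_sample_grads, clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """Illustrative DP-SGD update direction (not the Opacus API).

    per_sample_grads: array of shape (batch_size, *param_shape), one
    gradient per training example (the "batched per-sample gradients"
    the abstract refers to, as opposed to micro-batching).
    """
    rng = rng or np.random.default_rng(0)
    batch_size = per_sample_grads.shape[0]
    flat = per_sample_grads.reshape(batch_size, -1)
    # Clip each sample's gradient to L2 norm <= clip_norm.
    norms = np.linalg.norm(flat, axis=1)
    scale = np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
    clipped = per_sample_grads * scale.reshape(batch_size, *([1] * (per_sample_grads.ndim - 1)))
    # Average the clipped gradients and add Gaussian noise whose scale
    # is calibrated to the clipping bound (sensitivity) and batch size.
    noise = rng.normal(0.0, noise_multiplier * clip_norm / batch_size,
                       size=per_sample_grads.shape[1:])
    return clipped.mean(axis=0) + noise
```

Computing all per-sample gradients in one batched pass, rather than looping over micro-batches of size one, is what gives the efficiency advantage mentioned above.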