We introduce Opacus, a free, open-source PyTorch library for training deep learning models with differential privacy (hosted at https://opacus.ai). Opacus is designed for simplicity, flexibility, and speed. It provides a simple and user-friendly API, and enables machine learning practitioners to make a training pipeline private by adding as little as two lines to their code. It supports a wide variety of layers, including multi-head attention, convolution, LSTM, and embedding, right out of the box, and it also provides the means for supporting other user-defined layers. Opacus computes batched per-sample gradients, providing better efficiency compared to the traditional "micro batch" approach. In this paper we present Opacus, detail the principles that drove its implementation and unique features, and compare its performance against other frameworks for differential privacy in ML.