In this paper, we are concerned with differentially private SGD algorithms in the setting of stochastic convex optimization (SCO). Most existing work requires the loss to be Lipschitz continuous and strongly smooth, and the model parameter to be uniformly bounded. However, these assumptions are restrictive, as many popular losses violate these conditions, including the hinge loss for SVM, the absolute loss in robust regression, and even the least squares loss over an unbounded domain. We significantly relax these restrictive assumptions and establish privacy and generalization (utility) guarantees for private SGD algorithms with output and gradient perturbations for non-smooth convex losses. Specifically, the loss function is only required to have an $\alpha$-H\"{o}lder continuous gradient (referred to as $\alpha$-H\"{o}lder smoothness), which instantiates Lipschitz continuity ($\alpha=0$) and strong smoothness ($\alpha=1$). We prove that noisy SGD with $\alpha$-H\"older smooth losses using gradient perturbation can guarantee $(\epsilon,\delta)$-differential privacy (DP) and attain the optimal excess population risk $O\Big(\frac{\sqrt{d\log(1/\delta)}}{n\epsilon}+\frac{1}{\sqrt{n}}\Big)$, up to logarithmic terms, with gradient complexity (i.e., the total number of iterations) $T = O\big(n^{\frac{2-\alpha}{1+\alpha}} + n\big)$. This shows an important trade-off between the $\alpha$-H\"older smoothness of the loss and the computational complexity $T$ for private SGD with statistically optimal performance. In particular, our results indicate that $\alpha$-H\"older smoothness with $\alpha \ge 1/2$ is sufficient to guarantee $(\epsilon,\delta)$-DP of noisy SGD algorithms while achieving the optimal excess risk with linear gradient complexity $T = O(n)$.
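For concreteness, in the standard formulation a loss $f(\cdot;z)$ has an $\alpha$-H\"older continuous gradient when $\|\nabla f(w;z)-\nabla f(w';z)\| \le L\|w-w'\|^{\alpha}$ for all $w, w'$, which recovers a bounded gradient (Lipschitz loss) at $\alpha=0$ and a Lipschitz gradient (strong smoothness) at $\alpha=1$. The gradient-perturbation mechanism discussed above can be sketched as follows; this is a minimal illustration rather than the paper's exact algorithm, and the step size eta, iteration count T, and noise level sigma are placeholders that a privacy analysis would calibrate to $\epsilon$, $\delta$, and the gradient sensitivity.

```python
import numpy as np

def noisy_sgd(grad_fn, w0, data, T, eta, sigma, seed=0):
    """Minimal sketch of SGD with Gaussian gradient perturbation.

    grad_fn(w, z) returns the gradient of the loss at parameter w on a
    single example z. The noise scale sigma is a placeholder; in an
    (epsilon, delta)-DP analysis it would be calibrated to the gradient
    sensitivity, the number of iterations T, epsilon, and delta.
    """
    rng = np.random.default_rng(seed)
    w = np.array(w0, dtype=float)
    n = len(data)
    for _ in range(T):
        i = rng.integers(n)                           # sample one example uniformly at random
        g = grad_fn(w, data[i])                       # stochastic gradient at the current iterate
        noise = rng.normal(0.0, sigma, size=w.shape)  # isotropic Gaussian perturbation
        w = w - eta * (g + noise)                     # noisy gradient step
    return w
```

Output perturbation, by contrast, runs plain SGD and adds a single calibrated noise vector to the final (or averaged) iterate before release.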


