We study stochastic convex optimization with heavy-tailed data under the constraint of differential privacy (DP). Most prior work on this problem is restricted to the case where the loss function is Lipschitz. Instead, as introduced by Wang, Xiao, Devadas, and Xu \cite{WangXDX20}, we study general convex loss functions with the assumption that the distribution of gradients has bounded $k$-th moments. We provide improved upper bounds on the excess population risk under concentrated DP for convex and strongly convex loss functions. Along the way, we derive new algorithms for private mean estimation of heavy-tailed distributions, under both pure and concentrated DP. Finally, we prove nearly-matching lower bounds for private stochastic convex optimization with strongly convex losses and mean estimation, showing new separations between pure and concentrated DP.