We consider stochastic convex optimization for heavy-tailed data with the guarantee of being differentially private (DP). Prior work on this problem is restricted to the gradient descent (GD) method, which is inefficient for large-scale problems. In this paper, we resolve this issue and derive the first high-probability bounds for a private stochastic method with clipping. For general convex problems, we derive excess population risks $\tilde{O}\left(\frac{d^{1/7}\sqrt{\ln\frac{(n \epsilon)^2}{\beta d}}}{(n\epsilon)^{2/7}}\right)$ and $\tilde{O}\left(\frac{d^{1/7}\ln\frac{(n\epsilon)^2}{\beta d}}{(n\epsilon)^{2/7}}\right)$ under the bounded and unbounded domain assumptions, respectively (here $n$ is the sample size, $d$ is the dimension of the data, $\beta$ is the confidence level, and $\epsilon$ is the privacy level). We then extend our analysis to the strongly convex case and the non-smooth case (which covers generalized smooth objectives with H\"{o}lder-continuous gradients), establishing new excess risk bounds without the bounded domain assumption. These results achieve lower excess risks and gradient complexities than existing methods in their corresponding cases. Numerical experiments are conducted to justify the theoretical improvement.
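The core mechanism the abstract refers to, a private stochastic gradient step with clipping, can be sketched as follows. This is a minimal illustration of generic DP-SGD with per-sample gradient clipping and Gaussian noise, not the paper's exact algorithm; the function name, step size, and noise calibration here are illustrative assumptions.

```python
import numpy as np


def clipped_private_step(per_sample_grads, w, clip_norm, noise_mult, lr, rng):
    """One illustrative DP-SGD step (not the paper's exact method).

    Each per-sample gradient is rescaled so its L2 norm is at most
    `clip_norm` (this bounds the sensitivity even for heavy-tailed data),
    the clipped gradients are averaged, Gaussian noise proportional to
    the clipping threshold is added, and a gradient step is taken.
    """
    clipped = []
    for g in per_sample_grads:
        norm = np.linalg.norm(g)
        # Scale down only gradients whose norm exceeds the threshold.
        clipped.append(g * min(1.0, clip_norm / max(norm, 1e-12)))
    n = len(clipped)
    # Noise standard deviation is calibrated to the per-step sensitivity
    # clip_norm / n; `noise_mult` stands in for the privacy calibration.
    noisy_mean = np.mean(clipped, axis=0) + rng.normal(
        0.0, noise_mult * clip_norm / n, size=w.shape
    )
    return w - lr * noisy_mean


# Example: a heavy-tailed outlier gradient is clipped before averaging.
rng = np.random.default_rng(0)
w0 = np.zeros(3)
grads = [np.array([10.0, 0.0, 0.0]), np.array([0.0, 0.5, 0.0])]
w1 = clipped_private_step(grads, w0, clip_norm=1.0, noise_mult=0.0, lr=1.0, rng=rng)
```

With the noise multiplier set to zero, the step reduces to plain averaged clipped gradient descent, which makes the clipping behavior easy to verify in isolation.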