We provide computationally efficient, differentially private algorithms for the classical regression settings of Least Squares Fitting, Binary Regression, and Linear Regression with unbounded covariates. Prior to our work, privacy constraints in such regression settings were studied under strong a priori bounds on the covariates. We consider the case of Gaussian marginals and extend recent differentially private techniques for mean and covariance estimation (Kamath et al., 2019; Karwa and Vadhan, 2018) to the sub-Gaussian regime. Our novel technical analysis yields differentially private algorithms for each of these classical regression settings. Through the case of Binary Regression, we capture the fundamental and widely studied models of logistic regression and linearly separable SVMs, and we learn an unbiased estimate of the true regression vector up to a scaling factor.
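To make the contrast with prior bounded-covariate work concrete, below is a minimal sketch of the standard baseline the abstract alludes to: differentially private least squares via sufficient-statistics perturbation with the Gaussian mechanism. This is an illustrative baseline, not the paper's algorithm; the clipping bounds clip_x and clip_y, the regularization choice, and the function name dp_least_squares are assumptions introduced here. Note that this baseline needs exactly the kind of a priori data bounds that the paper's unbounded-covariate setting removes.

```python
import numpy as np

def dp_least_squares(X, y, epsilon, delta, clip_x, clip_y, rng=None):
    """Sketch of (epsilon, delta)-DP least squares via sufficient-statistics
    perturbation (Gaussian mechanism). Requires a priori clipping bounds
    clip_x (row norms of X) and clip_y (|y|) -- the assumption the paper's
    setting dispenses with. Privacy is with respect to add/remove neighbors,
    and the noise scale below is the standard calibration for epsilon <= 1.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = X.shape

    # Clip each record so that a single row has bounded influence on the
    # sufficient statistics (X^T X, X^T y).
    row_norms = np.linalg.norm(X, axis=1, keepdims=True)
    X = X * np.minimum(1.0, clip_x / np.maximum(row_norms, 1e-12))
    y = np.clip(y, -clip_y, clip_y)

    # Joint L2 sensitivity of (X^T X, X^T y): one record contributes
    # ||x x^T||_F <= clip_x^2 and ||x y|| <= clip_x * clip_y.
    sens = np.sqrt(clip_x**4 + (clip_x * clip_y) ** 2)
    sigma = sens * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon

    # Symmetrized Gaussian noise on X^T X (off-diagonal variance sigma^2),
    # plain Gaussian noise on X^T y.
    E = rng.normal(scale=sigma, size=(d, d))
    noisy_xtx = X.T @ X + (E + E.T) / np.sqrt(2.0)
    noisy_xty = X.T @ y + rng.normal(scale=sigma, size=d)

    # Heuristic ridge term keeps the noisy Gram matrix well conditioned.
    return np.linalg.solve(noisy_xtx + np.sqrt(n) * np.eye(d), noisy_xty)
```

As a side note on the Binary Regression claim, "up to a scaling factor" reflects a standard Gaussian-marginal fact: if x is standard Gaussian and y depends on x only through the inner product of w and x (as in logistic regression or a separable SVM label model), then by rotational symmetry the population mean E[y x] equals c w for a model-dependent scalar c, so a private estimate of that mean recovers the direction of w.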