We propose differentially private algorithms for parameter estimation in both low-dimensional and high-dimensional sparse generalized linear models (GLMs) by constructing private versions of projected gradient descent. We show that the proposed algorithms are nearly rate-optimal by characterizing their statistical performance and establishing privacy-constrained minimax lower bounds for GLMs. The lower bounds are obtained via a novel technique based on Stein's Lemma that generalizes the tracing attack technique for privacy-constrained lower bounds. This lower-bound argument may be of independent interest, as it applies to general parametric models. Simulated and real data experiments are conducted to demonstrate the numerical performance of our algorithms.
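To make the construction concrete, the following is a minimal sketch of a noisy projected gradient descent iteration for a sparse logistic-regression GLM: per-sample gradients are clipped, Gaussian noise calibrated to the clipped sensitivity is added, and each step is projected onto the set of s-sparse vectors by hard thresholding. All hyperparameter values (clip threshold `C`, noise scale `sigma`, sparsity `s`, step size `eta`, iteration count `T`) and the function names are illustrative assumptions, not the paper's calibrated choices.

```python
import numpy as np

def hard_threshold(beta, s):
    """Project onto s-sparse vectors: keep the s largest-magnitude entries."""
    out = np.zeros_like(beta)
    idx = np.argsort(np.abs(beta))[-s:]
    out[idx] = beta[idx]
    return out

def private_sparse_logistic(X, y, s, T=50, eta=0.1, C=1.0, sigma=1.0, seed=0):
    """Illustrative noisy projected gradient descent for sparse logistic regression."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    beta = np.zeros(d)
    for _ in range(T):
        # Per-sample gradients of the logistic loss.
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        grads = (p - y)[:, None] * X  # shape (n, d)
        # Clip each sample's gradient to L2 norm at most C (bounds sensitivity).
        norms = np.linalg.norm(grads, axis=1, keepdims=True)
        grads = grads * np.minimum(1.0, C / np.maximum(norms, 1e-12))
        # Average, then add Gaussian noise scaled to the clipped sensitivity.
        g = grads.mean(axis=0) + rng.normal(0.0, sigma * C / n, size=d)
        # Gradient step followed by projection onto the sparse set.
        beta = hard_threshold(beta - eta * g, s)
    return beta
```

The privacy guarantee of such a scheme follows from composing the Gaussian mechanism over the `T` iterations; the projection step is a post-processing operation and costs no additional privacy budget.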