We revisit the problem of differentially private squared-error linear regression. We observe that existing state-of-the-art methods are sensitive to the choice of hyperparameters, including the ``clipping threshold,'' which cannot be set optimally in a data-independent way. We give a new algorithm for private linear regression based on gradient boosting. We show that our method consistently improves over the previous state of the art when the clipping threshold is fixed without knowledge of the data rather than tuned non-privately, and that even when the clipping threshold is optimized non-privately, our algorithm is no worse. In addition to a comprehensive set of experiments, we give theoretical insights that explain this behavior.
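To make the role of the clipping threshold concrete, the following is a minimal illustrative sketch (not the paper's exact algorithm) of gradient boosting for squared-error linear regression with clipped per-example residuals and Gaussian noise added at each round; the function name, parameters, and noise calibration are assumptions for illustration only.

```python
# Illustrative sketch only: boosting rounds that fit a linear correction to
# clipped, noised residuals. Assumes feature vectors have bounded norm so the
# sensitivity of the aggregated statistic scales with the clipping threshold C.
import numpy as np

def dp_boosted_linear_regression(X, y, rounds=20, lr=0.1, C=1.0,
                                 noise_std=1.0, seed=None):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)  # the boosted ensemble of linear learners stays linear
    for _ in range(rounds):
        residuals = y - X @ w            # negative gradient of squared error
        r = np.clip(residuals, -C, C)    # bound each example's contribution
        # Noisy sufficient statistic X^T r; noise scale grows with C.
        g = X.T @ r + rng.normal(scale=noise_std * C, size=d)
        # Weak learner: a small step in the direction of the noisy correlation.
        w += lr * g / n
    return w
```

The sketch is only meant to show why a fixed, data-independent C matters: too small a threshold truncates most residuals, while too large a threshold forces proportionally more noise per round.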