The study of adaptive data analysis examines how many statistical queries can be answered accurately on a fixed dataset while avoiding false discoveries (statistically inaccurate answers). In this paper, we tackle a question that precedes this line of work: Is data valuable only when it provides accurate answers to statistical queries? To answer this question, we use Stochastic Convex Optimization as a case study. In this model, an algorithm is viewed as an analyst that, at each iteration, queries an estimate of the gradient of a noisy function and moves toward its minimizer. It is known that $O(1/\epsilon^2)$ examples suffice to minimize the objective function, yet none of the existing methods rely on the accuracy of the estimated gradients along the trajectory. We therefore ask: How many samples are needed to minimize a noisy convex function if we require $\epsilon$-accurate estimates of $O(1/\epsilon^2)$ gradients? Or, might it be that inaccurate gradient estimates are \emph{necessary} for finding the minimum of a stochastic convex function at an optimal statistical rate? We provide two partial answers to this question. First, we show that a general analyst (whose queries may be maliciously chosen) requires $\Omega(1/\epsilon^3)$ samples, ruling out the possibility of a foolproof mechanism. Second, we show that, under certain assumptions on the oracle, $\tilde \Omega(1/\epsilon^{2.5})$ samples are necessary when gradient descent interacts with the oracle. Our results stand in contrast to classical bounds showing that $O(1/\epsilon^2)$ samples can optimize the population risk to accuracy $O(\epsilon)$, but with spurious gradients.
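The interaction model described above can be sketched in a few lines. This is a minimal illustration (not from the paper): an analyst runs gradient descent while querying an empirical gradient oracle built from $n$ samples, for a simple stochastic convex objective $F(x) = \mathbb{E}_z[\tfrac12\|x - z\|^2]$ whose population minimizer is $\mathbb{E}[z]$. All names, the objective, and the parameter values are illustrative assumptions.

```python
import numpy as np

# Sketch of the oracle-interaction model, under the assumptions stated above.
rng = np.random.default_rng(0)
d, n = 5, 400                          # dimension, number of samples
mu = np.ones(d)                        # population mean (unknown to the analyst)
data = mu + rng.normal(size=(n, d))    # samples z_i ~ N(mu, I)

def gradient_oracle(x):
    """Empirical gradient of F at x: average of grad f(x, z_i) = x - z_i."""
    return x - data.mean(axis=0)

x = np.zeros(d)
eta, T = 0.5, 50                       # step size and number of gradient queries
for _ in range(T):
    x = x - eta * gradient_oracle(x)   # analyst moves toward the minimizer

# The iterate converges to the empirical mean; its distance to the population
# minimizer mu is governed by the sampling error, roughly O(1/sqrt(n)).
```

Note that this oracle answers every query with the same $n$ samples, which is exactly the regime where the accuracy of individual gradient estimates, rather than only the final risk, becomes the question.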