For $V : \mathbb{R}^d \to \mathbb{R}$ coercive, we study the convergence rate, in $L^1$-distance, of the empirical minimizer — the minimum of $V$ observed through a finite number $n$ of noisy samples — to the minimum of $V$. We show that in general, for unbounded functions with fast growth, the convergence rate is bounded above by $a_n n^{-1/q}$, where $q$ is the dimension of the latent random variable and where $a_n = o(n^\varepsilon)$ for every $\varepsilon > 0$. We then present applications to optimization problems arising in Machine Learning and in Monte Carlo simulation.
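The $n^{-1/q}$ rate can be illustrated numerically. The sketch below is a hypothetical toy setup, not the paper's construction: it takes the Lipschitz coercive function $V(x) = \|x\|$ (minimum $0$ at the origin), draws $n$ i.i.d. latent samples uniform on $[0,1]^q$, and estimates the expected gap between the empirical minimum $\min_i V(X_i)$ and $\inf V$; multiplying $n$ by $16$ should shrink the gap by roughly $16^{1/q}$.

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_gap(q, n, trials=300):
    """Monte Carlo estimate of E[min_i V(X_i) - inf V] for V(x) = |x|
    and X_i i.i.d. uniform on [0,1]^q (a toy stand-in for the latent variable)."""
    X = rng.random((trials, n, q))          # trials independent samples of size n
    gaps = np.linalg.norm(X, axis=2).min(axis=1)  # min_i |X_i| per trial; inf V = 0
    return gaps.mean()

for q in (1, 2):
    ratio = mean_gap(q, 100) / mean_gap(q, 1600)
    # expected scaling: gap ~ C n^{-1/q}, so ratio ~ 16^(1/q)
    print(f"q={q}: gap ratio n=100 vs n=1600 is {ratio:.2f}")
```

With $q = 1$ the ratio is near $16$, while with $q = 2$ it is near $4$, matching the $n^{-1/q}$ dependence on the latent dimension.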