This work investigates the asymptotic behaviour of the gradient approximation method known as the generalized simplex gradient (GSG). This method has an error bound that at first glance appears to tend to infinity as the number of sample points increases, but with some careful construction we show that this is not the case. For functions in finite dimensions, we present two new error bounds that remain valid as the number of sample points tends to infinity, with the applicable bound depending on the position of the reference point. Neither bound depends on the number of sample points, and thus both remain finite.
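To fix ideas, the following is a minimal NumPy sketch of one common formulation of the generalized simplex gradient, in which the gradient estimate is the least-squares solution built from the sampled function differences. The function name, the test function, and the choice of directions are illustrative assumptions, not part of this work.

```python
import numpy as np

def generalized_simplex_gradient(f, x0, S):
    """Sketch of a generalized simplex gradient (GSG).

    x0 : reference point, shape (n,)
    S  : matrix whose columns s_1, ..., s_k are the sample directions,
         shape (n, k); the sample points are x0 + s_i
    Returns the least-squares gradient estimate (S^T)^+ delta_f,
    where delta_f[i] = f(x0 + s_i) - f(x0).
    """
    delta_f = np.array([f(x0 + S[:, i]) - f(x0) for i in range(S.shape[1])])
    return np.linalg.pinv(S.T) @ delta_f

if __name__ == "__main__":
    # Illustrative example: estimate the gradient of a smooth test function.
    f = lambda x: x[0] ** 2 + 3.0 * x[1]
    x0 = np.array([1.0, 2.0])
    h = 1e-4
    S = h * np.array([[1.0, 0.0, 1.0],
                      [0.0, 1.0, 1.0]])  # k = 3 sample directions in R^2
    print(generalized_simplex_gradient(f, x0, S))  # approximately [2.0, 3.0]
```

As the number of columns of S grows, the standard error bound for this estimate appears to degrade; the bounds presented in this work are instead independent of the number of sample points.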