This paper concerns a convex, stochastic zeroth-order optimization (S-ZOO) problem: the objective is to minimize the expectation of a cost function whose gradient is not directly accessible. For this problem, traditional optimization algorithms mostly yield query complexities that grow polynomially with dimensionality (the number of decision variables). Consequently, these methods may not perform well on the massive-dimensional problems arising in many modern applications. Although more recent methods can be provably dimension-insensitive, almost all of them require arguably more stringent conditions, such as everywhere-sparse or compressible gradients. In this paper, we propose a sparsity-inducing stochastic gradient-free (SI-SGF) algorithm, which provably achieves a dimension-free (up to a logarithmic term) query complexity in both the convex and the strongly convex case. Such insensitivity to dimensionality growth is proven, for the first time, to be achievable even when neither gradient sparsity nor gradient compressibility holds. Our numerical results demonstrate consistency between our theoretical predictions and the empirical performance.
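To make the query model concrete: in S-ZOO the algorithm can only evaluate noisy function values, so gradients must be estimated from queries alone. The sketch below shows a standard two-point zeroth-order gradient estimator (a textbook building block of the ZOO literature, not the SI-SGF algorithm proposed in this paper); the function `zo_gradient_estimate` and its parameters are illustrative names.

```python
import numpy as np

def zo_gradient_estimate(f, x, mu=1e-4, num_samples=100, rng=None):
    """Two-point zeroth-order gradient estimator.

    Averages directional finite differences along random Gaussian
    directions; each sample costs two function queries. This is the
    generic ZOO primitive, not the paper's SI-SGF method.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = x.shape[0]
    fx = f(x)  # reuse the base query across all samples
    g = np.zeros(d)
    for _ in range(num_samples):
        u = rng.standard_normal(d)          # random search direction
        g += (f(x + mu * u) - fx) / mu * u  # directional difference
    return g / num_samples
```

The per-query variance of this estimator grows with the dimension `d`, which is exactly why naive S-ZOO methods incur a polynomial dimension dependence in their query complexity.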