This paper concerns a convex stochastic zeroth-order optimization (S-ZOO) problem, in which the objective is to minimize the expectation of a cost function whose gradient is not directly accessible. For this problem, traditional optimization techniques mostly yield query complexities that grow polynomially with dimensionality, i.e., the number of function evaluations required is polynomial in the number of decision variables. Consequently, these methods may not perform well on the massive-dimensional problems arising in many modern applications. Although more recent methods can be provably dimension-insensitive, almost all of them rely on arguably more stringent conditions, such as everywhere sparse or compressible gradients. Thus, prior to this research, it was unknown whether dimension-insensitive S-ZOO is possible without such conditions. In this paper, we give an affirmative answer to this question by proposing a sparsity-inducing stochastic gradient-free (SI-SGF) algorithm, which is proved to achieve dimension-insensitive query complexity in both the convex and strongly convex cases even when neither gradient sparsity nor gradient compressibility holds. Our numerical results demonstrate the strong potential of the proposed SI-SGF compared with existing alternatives.
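For concreteness, a minimal sketch of the standard S-ZOO formulation the abstract refers to is given below; the notation ($x$, $\xi$, $F$, $d$) and the oracle description are our own assumptions rather than taken from the paper:
\[
  \min_{x \in \mathcal{X} \subseteq \mathbb{R}^{d}} \; f(x) := \mathbb{E}_{\xi}\bigl[ F(x,\xi) \bigr],
\]
where the algorithm may only query noisy function values $F(x,\xi)$ at chosen points $x$ (a zeroth-order oracle), and neither $\nabla f$ nor $\nabla F$ is observable. In this setting, "dimension-insensitive" is typically understood to mean that the query complexity depends on the dimension $d$ at most logarithmically rather than polynomially.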