We develop two adaptive discretization algorithms for convex semi-infinite optimization that terminate after finitely many iterations at approximate solutions of arbitrary precision; in particular, they terminate at a feasible point of the considered optimization problem. Compared with existing finitely feasible algorithms for general semi-infinite optimization problems, our algorithms work with considerably smaller discretizations and are therefore computationally favorable. Moreover, our algorithms terminate at approximate solutions of arbitrary precision, whereas for general semi-infinite optimization problems the best achievable approximate-solution precision can be arbitrarily bad. All finite optimization subproblems arising in our algorithms need to be solved only approximately, and continuity is the only regularity assumption on the objective and constraint functions. Applications to parametric and non-parametric regression problems under shape constraints are discussed.
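To make the idea of adaptive discretization concrete, here is a minimal sketch of a classical exchange (refine-and-resolve) scheme on a toy convex semi-infinite program. This is illustrative only and is not the algorithm proposed in the paper: the example problem, the tolerance, the reference grid, and all function names are assumptions chosen for the sketch.

```python
import numpy as np
from scipy.optimize import linprog

# Toy convex semi-infinite program (illustrative, not from the paper):
#   minimize    x1 + x2
#   subject to  t*x1 + (1 - t)*x2 >= sin(pi*t)   for all t in [0, 1].
# The constraint at t = 1/2 forces x1 + x2 >= 2, and x = (1, 1) is
# feasible, so the optimal value is 2.

def solve_discretized(T):
    """Solve the LP in which the semi-infinite constraint is enforced
    only on the finite discretization T (rows as A_ub @ x <= b_ub)."""
    A_ub = np.array([[-t, -(1.0 - t)] for t in T])
    b_ub = np.array([-np.sin(np.pi * t) for t in T])
    res = linprog(c=[1.0, 1.0], A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] * 2)
    return res.x, res.fun

def worst_violation(x, grid):
    """Most violated constraint parameter on a fine reference grid."""
    g = np.sin(np.pi * grid) - (grid * x[0] + (1.0 - grid) * x[1])
    i = int(np.argmax(g))
    return grid[i], g[i]

grid = np.linspace(0.0, 1.0, 2001)
T = [0.0, 1.0]                      # initial (coarse) discretization
for _ in range(100):
    x, obj = solve_discretized(T)
    t_bad, viol = worst_violation(x, grid)
    if viol <= 1e-6:                # (near-)feasible for the full problem
        break
    T.append(t_bad)                 # refine: add the worst violator

print(f"objective = {obj:.6f} with {len(T)} discretization points")
```

The sketch only conveys the refine-and-resolve structure: solve a finite subproblem, locate the most violated constraint, enlarge the discretization, and repeat. In the paper's setting, by contrast, the subproblems are general convex programs that need to be solved only approximately, and the discretization is managed so that the method terminates finitely at a point that is feasible for the full semi-infinite problem.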