Approximations of optimization problems arise in computational procedures and sensitivity analysis. The resulting effect on solutions can be significant, with even small approximations of a problem's components translating into large errors in its solutions. We specify conditions under which approximations are well behaved in the sense of minimizers, stationary points, and level-sets, and this leads to a framework of consistent approximations. The framework is developed for a broad class of composite problems, which are neither convex nor smooth. We demonstrate the framework using examples from stochastic optimization, neural-network-based machine learning, distributionally robust optimization, penalty and augmented Lagrangian methods, interior-point methods, homotopy methods, smoothing methods, extended nonlinear programming, difference-of-convex programming, and multi-objective optimization. An enhanced proximal method illustrates the algorithmic possibilities. A quantitative analysis supplements the development by furnishing rates of convergence.
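The "enhanced proximal method" above is the paper's own algorithm, which is not specified in this abstract. As a hedged, generic sketch of the proximal-gradient template for composite problems min f(x) + g(x) that such methods build on (the problem data and function names below are illustrative, not taken from the paper):

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(grad_f, prox_g, x0, step, iters=500):
    """Basic proximal-gradient iteration for min f(x) + g(x):
    x_{k+1} = prox_{step * g}(x_k - step * grad_f(x_k))."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = prox_g(x - step * grad_f(x), step)
    return x

# Illustrative composite problem: min 0.5*||A x - b||^2 + lam*||x||_1.
A = np.array([[1.0, 0.0], [0.0, 2.0]])
b = np.array([1.0, 0.5])
lam = 0.1
grad_f = lambda x: A.T @ (A @ x - b)          # gradient of the smooth part
prox_g = lambda v, t: soft_threshold(v, lam * t)
step = 1.0 / np.linalg.norm(A.T @ A, 2)        # 1/L, L = Lipschitz constant of grad_f
x_star = proximal_gradient(grad_f, prox_g, np.zeros(2), step)
```

Here g is the (nonsmooth) l1-norm, so the composite objective is neither smooth nor, in general, the sort of problem classical gradient methods handle directly; the prox step absorbs the nonsmooth term in closed form.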