For a broad class of nonconvex programming problems over an unbounded constraint set, we propose a self-adaptive step-size strategy that avoids line-search techniques, and we establish the convergence of a general scheme under mild assumptions. Specifically, the objective function need not be convex. Unlike descent line-search algorithms, the method does not require a known Lipschitz constant to determine the initial step size. Its crucial feature is the steady reduction of the step size until a prescribed condition is fulfilled. In particular, it yields a new gradient projection method for optimization problems over an unbounded constraint set. Preliminary results on several computational examples verify the correctness of the proposed method. To demonstrate its effectiveness on large-scale problems, we apply it to machine-learning experiments such as supervised feature selection, multivariable logistic regression, and neural networks for classification.
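The abstract does not state the exact step-size reduction test, so the following is only a minimal illustrative sketch of the general idea: a projected-gradient loop whose step size is shrunk by a fixed factor until a local smoothness test holds, so that no global Lipschitz constant is needed up front. All names (`projected_gradient_adaptive`, `grad`, `project`), the shrink factor `mu`, and the 0.9 threshold in the test are assumptions for illustration, not the paper's actual method.

```python
import numpy as np

def projected_gradient_adaptive(grad, project, x0, lam0=1.0, mu=0.5,
                                max_iter=500, tol=1e-8):
    """Illustrative projected-gradient sketch with a self-adaptive step size.

    The step size ``lam`` is steadily reduced (by the hypothetical factor
    ``mu``) whenever a local smoothness test fails, so no global Lipschitz
    constant is required to start.

    grad    : callable returning the gradient of the (possibly nonconvex)
              objective at a point.
    project : callable projecting a point onto the constraint set, which
              may be unbounded (e.g. a half-space).
    """
    x = np.asarray(x0, dtype=float)
    lam = lam0
    for _ in range(max_iter):
        g = grad(x)
        x_new = project(x - lam * g)
        # Self-adaptive test (an assumed stand-in for the paper's condition):
        # shrink the step until the local change of the gradient is
        # controlled by the change of the iterate.
        while (lam * np.linalg.norm(grad(x_new) - g)
               > 0.9 * np.linalg.norm(x_new - x)) and lam > 1e-12:
            lam *= mu
            x_new = project(x - lam * g)
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            return x_new
        x = x_new
    return x

# Usage sketch: a nonconvex objective f(x) = x^4 - 3x^2 + x minimized over
# the unbounded set {x >= -2}, whose projection is a componentwise clip.
grad = lambda x: 4 * x**3 - 6 * x + 1
project = lambda x: np.maximum(x, -2.0)
x_star = projected_gradient_adaptive(grad, project, x0=np.array([2.0]))
```

Because the step size only ever decreases here, the loop matches the abstract's description of "steady reduction of the step size until a certain condition is fulfilled"; the particular test used above is one common local Lipschitz estimate, chosen only to make the sketch concrete.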