We develop a novel stepsize, named the regularized \BB (RBB) stepsize, based on the \BB method for solving challenging optimization problems efficiently. We show that the RBB stepsize is the closed-form solution to an $\ell_{2}^{2}$-regularized least squares problem. When the regularization term vanishes, the RBB stepsize reduces to the original \BB stepsize. The RBB stepsize covers a class of valid stepsizes, including the other version of the \BB stepsize. We prove the global convergence of the corresponding RBB algorithm for convex quadratic optimization problems. We also propose a scheme, named the adaptive two-step parameter, for generating the regularization parameter adaptively. The resulting enhanced RBB stepsize solves quadratic and general optimization problems more efficiently. The RBB stepsize can overcome the instability of the \BB stepsize on many ill-conditioned optimization problems, and it proves more robust than the \BB stepsize in numerical experiments. Numerical examples clearly show the advantage of the proposed stepsize on challenging optimization problems.
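To make the least-squares interpretation concrete, a minimal sketch follows; the specific regularized problem, the definitions $s_{k-1}=x_k-x_{k-1}$ and $y_{k-1}=g_k-g_{k-1}$, and the parameter $\lambda_k$ are illustrative assumptions rather than the paper's own formulation. Writing $\beta=1/\alpha$, one plausible $\ell_{2}^{2}$-regularized problem and its closed-form solution are
\[
\min_{\beta}\;\|\beta\,s_{k-1}-y_{k-1}\|_2^2+\lambda_k\beta^2
\quad\Longrightarrow\quad
\alpha_k^{\mathrm{RBB}}=\frac{s_{k-1}^{\top}s_{k-1}+\lambda_k}{s_{k-1}^{\top}y_{k-1}}.
\]
Under this sketch, $\lambda_k=0$ recovers the original \BB stepsize $s^{\top}s/s^{\top}y$, while the choice $\lambda_k=(s^{\top}y)^2/(y^{\top}y)-s^{\top}s$ recovers the other \BB stepsize $s^{\top}y/y^{\top}y$, consistent with the family of stepsizes described above.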