In large-scale optimization, when forming or storing Hessian matrices is prohibitively expensive, quasi-Newton methods are often used in lieu of Newton's method because they require only first-order information to approximate the true Hessian. Multipoint symmetric secant (MSS) methods can be thought of as generalizations of quasi-Newton methods in that they impose additional requirements on their approximation of the Hessian. Given an initial Hessian approximation, MSS methods generate a sequence of possibly indefinite matrices using rank-2 updates to solve nonconvex unconstrained optimization problems. For practical reasons, up to now, the initialization has been a constant multiple of the identity matrix. In this paper, we propose a new limited-memory MSS method for large-scale nonconvex optimization that allows for dense initializations. Numerical results on the CUTEst test problems suggest that the MSS method using a dense initialization outperforms the standard initialization. Numerical results also suggest that this approach is competitive with both a basic L-SR1 trust-region method and an L-PSB method.