In this paper, we propose a modified nonlinear conjugate gradient (NCG) method for functions whose gradient is not Lipschitz continuous. First, we present a new formula for the conjugate coefficient \beta_k in NCG, yielding a search direction that provides an adequate decrease in the objective function. We show that our NCG algorithm is strongly convergent for continuously differentiable functions without a Lipschitz continuous gradient. Second, we present a simple interpolation approach that automatically achieves shrinkage, generating at each iteration a step length satisfying the standard Wolfe conditions. Our framework considerably broadens the applicability of NCG while preserving the superior numerical performance of PRP-type methods.
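The standard Wolfe conditions referenced above are well established in the line-search literature; as a point of reference, the following minimal sketch checks them for a given step length. The function name `satisfies_wolfe` and the test problem are illustrative choices, not part of the paper's method, and the paper's own interpolation procedure for generating such step lengths is not reproduced here.

```python
import numpy as np

def satisfies_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.9):
    """Check the standard (weak) Wolfe conditions for step length alpha
    along search direction d at point x.

    Sufficient decrease: f(x + alpha*d) <= f(x) + c1*alpha*grad(x)^T d
    Curvature:           grad(x + alpha*d)^T d >= c2*grad(x)^T d
    """
    g_dot_d = grad(x) @ d                       # directional derivative at x
    armijo = f(x + alpha * d) <= f(x) + c1 * alpha * g_dot_d
    curvature = grad(x + alpha * d) @ d >= c2 * g_dot_d
    return armijo and curvature

# Illustration on a simple quadratic f(x) = 0.5*||x||^2 (gradient is x)
f = lambda x: 0.5 * float(x @ x)
grad = lambda x: x
x = np.array([1.0, 2.0])
d = -grad(x)                                    # steepest-descent direction
print(satisfies_wolfe(f, grad, x, d, alpha=0.5))   # a moderate step passes
print(satisfies_wolfe(f, grad, x, d, alpha=2.0))   # an overshooting step fails
```

A step that is too long violates the sufficient-decrease condition, while one that is too short violates the curvature condition; the paper's interpolation scheme is designed to land between these extremes.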