A class of multi-level algorithms for unconstrained nonlinear optimization is presented that does not require the evaluation of the objective function. The class contains the momentum-less AdaGrad method as a particular (single-level) instance. Avoiding the evaluation of the objective function is intended to make the algorithms of the class less sensitive to noise, while the multi-level feature aims at reducing their computational cost. The evaluation complexity of these algorithms is analysed, and their behaviour in the presence of noise is then illustrated in the context of training deep neural networks for supervised learning applications.
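For orientation on the single-level special case mentioned above: the momentum-less AdaGrad iteration updates each coordinate as x_{k+1} = x_k - alpha * g_k / sqrt(sum_{i<=k} g_i^2), using only gradient information and never an objective value. A minimal sketch under that standard reading of AdaGrad (the names adagrad_step, lr, and eps are illustrative, not taken from the paper):

```python
import numpy as np

def adagrad_step(x, grad, accum, lr=1e-2, eps=1e-8):
    """One momentum-less AdaGrad update: scale each gradient coordinate
    by the square root of the accumulated squared gradients."""
    accum += grad ** 2
    x -= lr * grad / (np.sqrt(accum) + eps)
    return x, accum

# Toy quadratic f(x) = 0.5 * ||x||^2, whose gradient is x itself.
# Note that only gradients are used; f is never evaluated.
x = np.array([3.0, -2.0])
accum = np.zeros_like(x)
for _ in range(200):
    grad = x.copy()
    x, accum = adagrad_step(x, grad, accum, lr=0.5)
print(x)  # entries shrink toward the minimizer at the origin
```

The placement of eps outside the square root follows one common convention; it only guards against division by zero and does not affect the objective-function-free character of the method.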