Adam and AdaBelief compute and exploit elementwise adaptive stepsizes when training deep neural networks (DNNs) by tracking the exponential moving average (EMA) of the squared gradient g_t^2 and of the squared prediction error (m_t - g_t)^2, respectively, where m_t is the first momentum at iteration t and can be viewed as a prediction of g_t. In this work, we investigate whether layerwise gradient statistics can be exploited in Adam and AdaBelief to allow for more effective training of DNNs. We address this research question in two steps. Firstly, we slightly modify Adam and AdaBelief by introducing layerwise adaptive stepsizes into their update procedures via either pre- or post-processing. Empirical study indicates that these slight modifications produce comparable performance when training VGG and ResNet models on CIFAR10, suggesting that layerwise gradient statistics play an important role in the success of Adam and AdaBelief on at least certain DNN tasks. In the second step, instead of setting the layerwise stepsizes manually, we propose Aida, a new optimisation method designed so that the elementwise stepsizes within each layer have significantly smaller statistical variances. Motivated by the fact that (m_t - g_t)^2 in AdaBelief is conservative in comparison to g_t^2 in Adam in terms of layerwise statistical means and variances, Aida tracks a more conservative function of m_t and g_t than (m_t - g_t)^2 in AdaBelief via layerwise orthogonal vector projections. Experimental results show that Aida produces either competitive or better performance with respect to a number of existing methods, including Adam and AdaBelief, on a set of challenging DNN tasks.
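To make the contrast concrete, the following is a minimal sketch (for a single scalar parameter, with illustrative variable names) of the two second-moment trackers the abstract compares: Adam's EMA of g_t^2 versus AdaBelief's EMA of (m_t - g_t)^2. It does not reproduce the full optimisers or the proposed Aida projections.

```python
def ema_second_moments(grads, beta1=0.9, beta2=0.999):
    """Track the EMA statistics that Adam and AdaBelief use for stepsizes.

    grads: sequence of scalar gradients g_t (illustrative toy input).
    Returns the first momentum m_t, Adam's EMA of g_t^2, and
    AdaBelief's EMA of the squared prediction error (m_t - g_t)^2.
    """
    m = 0.0         # first momentum m_t, a prediction of g_t
    v_adam = 0.0    # Adam: EMA of the squared gradient g_t^2
    v_belief = 0.0  # AdaBelief: EMA of (m_t - g_t)^2
    for g in grads:
        m = beta1 * m + (1 - beta1) * g
        v_adam = beta2 * v_adam + (1 - beta2) * g ** 2
        v_belief = beta2 * v_belief + (1 - beta2) * (m - g) ** 2
    return m, v_adam, v_belief
```

On a stream of consistent gradients, (m_t - g_t)^2 stays well below g_t^2, which illustrates the sense in which AdaBelief's tracked quantity is more conservative than Adam's.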