We propose a second-order gradient-based method, combined with ADAM and RMSprop, for training generative adversarial networks. Among prominent second-order methods, the proposed method is the fastest to reach comparable accuracy. Unlike recent state-of-the-art methods, it requires neither solving a linear system nor additional mixed second-derivative terms. We derive the fixed-point iteration corresponding to the proposed method and show that it is convergent. The method produces better or comparable inception scores, and comparable image quality, relative to other recently proposed state-of-the-art second-order methods; compared to first-order methods such as ADAM, it produces significantly better inception scores. We validate the proposed method on image generation tasks over popular datasets such as FFHQ, LSUN, CIFAR10, MNIST, and Fashion MNIST\footnote{Accepted in IJCNN 2023}. Code: \url{https://github.com/misterpawan/acom}