This paper presents a control-theoretic approach to both batch and instantaneous weight updates in feed-forward neural networks. The Hamilton-Jacobi-Bellman (HJB) equation is used to derive an optimal weight update law. The main contribution of this paper is that closed-form solutions for both the optimal cost and the weight update can be obtained for any feed-forward network from the HJB equation in a simple yet elegant manner. The proposed approach is compared with some of the best-performing existing learning algorithms and, as expected, is found to converge faster in terms of computational time. Benchmark data sets such as 8-bit parity, breast cancer, and credit approval, as well as the 2D Gabor function, are used to validate these claims. The paper also discusses issues related to global optimization: the limitations of popular deterministic weight update laws are critiqued, and the possibility of global optimization using the HJB formulation is examined. It is hoped that the proposed algorithm will generate considerable interest among researchers working on fast learning algorithms and global optimization.