We study and introduce new gradient operators in the complex and bicomplex settings, inspired by the well-known Least Mean Square (LMS) algorithm invented in 1960 by Widrow and Hoff for the Adaptive Linear Neuron (ADALINE). These gradient operators are used to formulate new learning rules for the Bicomplex Least Mean Square (BLMS) algorithm. This approach extends both the classical real and complex LMS algorithms.
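For orientation only, the standard Widrow--Hoff recursion that the real and complex LMS algorithms share is sketched below in its usual textbook form; the symbols $\mathbf{w}_n$ (weight vector), $\mathbf{x}_n$ (input vector), $d_n$ (desired response), $e_n$ (a priori error) and $\mu$ (step size) follow common conventions and are not taken from this text, and conjugation conventions for the complex case vary across references:
\begin{align*}
  e_n &= d_n - \mathbf{w}_n^{T}\mathbf{x}_n, &
  \mathbf{w}_{n+1} &= \mathbf{w}_n + \mu\, e_n\, \mathbf{x}_n && \text{(real LMS)}\\
  e_n &= d_n - \mathbf{w}_n^{H}\mathbf{x}_n, &
  \mathbf{w}_{n+1} &= \mathbf{w}_n + \mu\, e_n^{*}\, \mathbf{x}_n && \text{(complex LMS)}
\end{align*}
The BLMS learning rules introduced here generalize this type of update to bicomplex-valued weights and signals.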