Standard artificial neural networks (ANNs) use sum-product (multiply-accumulate) node operations followed by a memoryless nonlinear activation. These networks are known to have universal function approximation capabilities. Previously proposed morphological perceptrons use max-sum node processing in place of sum-product and have promising properties for circuit implementations. In this paper we show that these max-sum ANNs do not have universal approximation capabilities. Furthermore, we consider the proposed signed-max-sum and max-star-sum generalizations of morphological ANNs and show that these variants also lack universal approximation capabilities. We contrast these variants with log-number system (LNS) implementations, which also avoid multiplications but do exhibit universal approximation capabilities.
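The three node operations contrasted in the abstract can be sketched as follows. This is a minimal illustration, not the paper's construction: the function names, the choice of `tanh` as the activation, and the specific arrays are illustrative assumptions; only the arithmetic structure (sum-product vs. max-sum vs. addition in the log domain) follows the text.

```python
import numpy as np

def mac_node(x, w, b):
    """Standard ANN node: multiply-accumulate (dot product) plus bias,
    passed through a memoryless nonlinearity (tanh chosen for illustration)."""
    return np.tanh(np.dot(w, x) + b)

def max_sum_node(x, w):
    """Morphological perceptron node: max-sum processing.
    Additions replace multiplications, and max replaces the sum;
    no multiplications are performed."""
    return np.max(w + x)

def lns_product(log_a, log_b):
    """Log-number system (LNS): a multiplication a*b becomes an
    addition of the operands' logarithms, log(a) + log(b),
    so an LNS node also avoids hardware multipliers."""
    return log_a + log_b
```

For example, `max_sum_node(np.array([1.0, 2.0]), np.array([0.5, 0.0]))` evaluates `max(1.5, 2.0)` and returns `2.0`, using only additions and a comparison.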