This paper concerns the approximation of smooth functions by neural networks (NNs). Mathematical or physical functions can be replaced by NN models through regression. In this study, we obtain NNs that generate highly accurate and highly smooth functions, composed of only a few weight parameters, by examining several aspects of regression. First, we reinterpret the inner workings of NNs for regression and, on that basis, propose a new activation function, the integrated sigmoid linear unit (ISLU). Next, we discuss the special characteristics of metadata for regression, which differ from those of other data such as images or sound, and show how they can be exploited to improve NN performance. Finally, we present a simple hierarchical NN that generates models substituting for mathematical functions, and we introduce a new batch concept, the ``meta-batch", which improves NN performance several times over. The new activation function, the meta-batch method, the features of numerical data, meta-augmentation with metaparameters, and an NN structure generating a compact multi-layer perceptron (MLP) are the essential elements of this study.