In this work, we introduce pixel-wise tensor normalization, which is inserted after rectified linear units and, together with batch normalization, yields a significant improvement in the accuracy of modern deep neural networks. In addition, this work addresses the robustness of networks. We show that the factorized superposition of images from the training set, combined with the reformulation of the multi-class problem as a multi-label problem, yields significantly more robust networks. Reformulating and adjusting the multi-class log loss also improves the results compared to overlaying images with only a single class as the label. https://atreus.informatik.uni-tuebingen.de/seafile/d/8e2ab8c3fdd444e1a135/?p=%2FTNandFDT&mode=list
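The abstract does not spell out the normalization formula, but one plausible reading of "pixel-wise tensor normalization" is standardizing the channel values at every spatial position, applied to the output of a ReLU. The sketch below illustrates that interpretation in NumPy; the function name, tensor layout, and epsilon are assumptions, not the paper's definitive method.

```python
import numpy as np

def pixelwise_tensor_norm(x, eps=1e-5):
    """Illustrative sketch: normalize a feature tensor at every pixel.

    x has shape (N, C, H, W). For each sample and each spatial
    position (h, w), the C channel values are shifted to zero mean
    and scaled to unit variance. Layout and naming are assumptions.
    """
    mean = x.mean(axis=1, keepdims=True)   # per-pixel mean over channels, (N, 1, H, W)
    var = x.var(axis=1, keepdims=True)     # per-pixel variance over channels
    return (x - mean) / np.sqrt(var + eps)

# Placed after a rectified linear unit, matching the abstract's insertion point:
relu_out = np.maximum(np.random.randn(2, 8, 4, 4), 0.0)
normed = pixelwise_tensor_norm(relu_out)
```

In practice such a layer would sit between the ReLU and the next convolution, alongside (not replacing) batch normalization, which normalizes over the batch dimension instead.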
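The "factorized superposition" with a multi-label reformulation can be read as blending several training images with mixing factors that sum to one, while marking every source class as a positive label instead of picking a single one. The following sketch shows that reading; the Dirichlet sampling of factors and all identifiers are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def superpose(images, labels, num_classes, rng=None):
    """Illustrative sketch: factorized superposition of training images.

    images: arrays of identical shape; labels: their class indices.
    The mixing factors are sampled so that they sum to 1, and the
    target is a multi-hot vector covering all source classes.
    """
    rng = np.random.default_rng() if rng is None else rng
    factors = rng.dirichlet(np.ones(len(images)))        # factors >= 0, sum to 1
    mixed = sum(f * img for f, img in zip(factors, images))
    target = np.zeros(num_classes)
    target[list(labels)] = 1.0                            # every source class is positive
    return mixed, target

imgs = [np.random.rand(3, 32, 32) for _ in range(2)]
x, y = superpose(imgs, labels=[1, 7], num_classes=10)
```

Under this multi-label formulation, training would use a per-class sigmoid loss (the adjusted multi-class log loss the abstract mentions) rather than a single softmax over classes.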