[retracted] We found that the difference depends on the Chainer library and does not replicate with another library (PyTorch), which indicates that the results are probably due to a bug in Chainer rather than being hardware-dependent.

-- old abstract

Deep neural networks often present uncertainties such as hardware- and software-derived noise and randomness. We studied the effects of such uncertainty on learning outcomes, with a particular focus on the function of graphics processing units (GPUs), and found that GPU-induced uncertainty increased the learning accuracy of a certain deep neural network. When training a predictive deep neural network using only the CPU without the GPU, the learning error is higher than when training for the same number of epochs using the GPU, suggesting that the GPU plays a role in the learning process beyond simply increasing the computational speed. Because this effect is not observed when training a simple autoencoder, it could be a phenomenon specific to certain types of neural networks. GPU-specific computational processing is more indeterminate than that performed by CPUs, and hardware-derived uncertainties, which are often considered obstacles that need to be eliminated, might, in some cases, be successfully incorporated into the training of deep neural networks. Moreover, such uncertainties might be interesting phenomena to consider in brain-related computational processing, which comprises a large mass of uncertain signals.
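To make the CPU-versus-GPU comparison concrete, below is a minimal sketch of the kind of replication check mentioned in the retraction note, written in PyTorch (the library the note refers to). The network, data, and hyperparameters are hypothetical placeholders, not the study's predictive network or autoencoder; the sketch only illustrates training the same model on CPU and GPU under identical seeds and comparing the resulting losses.

```python
# Minimal sketch (not the authors' setup): train the same tiny network on CPU
# and on GPU with identical seeds, then compare final losses. Model, data, and
# hyperparameters are illustrative assumptions only.
import torch
import torch.nn as nn


def train_once(device: str, epochs: int = 50) -> float:
    torch.manual_seed(0)  # identical seed for both runs
    # Ask PyTorch to prefer deterministic kernels; warn instead of erroring
    # when no deterministic implementation exists for an op.
    torch.use_deterministic_algorithms(True, warn_only=True)

    # Synthetic regression data (placeholder for the study's datasets).
    x = torch.randn(256, 32)
    y = torch.randn(256, 1)

    # Parameters are initialized on CPU with the same seed, then moved,
    # so both runs start from identical weights.
    model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 1)).to(device)
    opt = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.MSELoss()

    x, y = x.to(device), y.to(device)
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
    return loss.item()


cpu_loss = train_once("cpu")
print(f"CPU loss: {cpu_loss:.6f}")
if torch.cuda.is_available():
    gpu_loss = train_once("cuda")
    print(f"GPU loss: {gpu_loss:.6f}  diff: {abs(cpu_loss - gpu_loss):.2e}")
```

Under this kind of check, any remaining CPU/GPU gap would typically be on the order of floating-point rounding differences; a systematic accuracy gap that appears with one framework but not another would point to a library issue rather than a hardware effect, which is the conclusion the retraction note draws.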