Autoencoders are unsupervised neural networks that compress input data and then reconstruct it back to its original size. This makes autoencoders suitable for applications such as data compression, image classification, image denoising, and image colorization. On the hardware side, reconfigurable architectures such as Field Programmable Gate Arrays (FPGAs) have been used to accelerate computations across several domains because of their unique combination of flexibility, performance, and power efficiency. In this paper, we survey the available autoencoder variants and use a convolutional autoencoder in both FPGA- and GPU-based implementations to denoise static MNIST images. We compare the results of the two implementations and discuss the pros and cons of each. The proposed design achieves 80% accuracy, and our experimental results show that the accelerator reaches a throughput of 21.12 Giga-Operations Per Second (GOP/s) with 5.93 W of on-chip power consumption at 100 MHz. Comparisons with off-the-shelf devices and recent state-of-the-art implementations show that the proposed accelerator offers clear advantages in energy efficiency and design flexibility. We also discuss future work enabled by the proposed accelerator.
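For readers unfamiliar with the denoising setup described above, the following is a minimal PyTorch sketch of a convolutional autoencoder trained to reconstruct clean MNIST images from noisy inputs. The layer sizes, noise level, and learning rate are illustrative assumptions, not the exact architecture or hyperparameters evaluated in this paper.

import torch
import torch.nn as nn

class ConvAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: compress the 1x28x28 input into a smaller feature map.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, stride=2, padding=1),   # -> 16x14x14
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),  # -> 32x7x7
            nn.ReLU(),
        )
        # Decoder: reconstruct back to the original 1x28x28 size.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, kernel_size=3, stride=2,
                               padding=1, output_padding=1),         # -> 16x14x14
            nn.ReLU(),
            nn.ConvTranspose2d(16, 1, kernel_size=3, stride=2,
                               padding=1, output_padding=1),         # -> 1x28x28
            nn.Sigmoid(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

# One training step of the denoising objective: corrupt a clean image with
# Gaussian noise and learn to reconstruct the clean version.
model = ConvAutoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.MSELoss()

clean = torch.rand(8, 1, 28, 28)                  # stand-in for an MNIST batch
noisy = (clean + 0.3 * torch.randn_like(clean)).clamp(0.0, 1.0)
loss = criterion(model(noisy), clean)
optimizer.zero_grad()
loss.backward()
optimizer.step()

Because the network is fully convolutional, the same structure maps naturally onto FPGA accelerators built from multiply-accumulate arrays, which is the design space explored in this work.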