Complex Deep Neural Networks such as Capsule Networks (CapsNets) exhibit high learning capabilities at the cost of compute-intensive operations. To enable their deployment on edge devices, we propose to leverage approximate computing to design approximate variants of complex operations, such as softmax and squash. In our experiments, we evaluate the tradeoffs between the area, power consumption, and critical path delay of the designs implemented with an ASIC design flow and the accuracy of the quantized CapsNets, compared against the exact functions.
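For reference, a minimal sketch of the exact softmax and squash functions that the approximate hardware variants target, written in NumPy under the standard CapsNet definitions (the axis and eps parameters are illustrative software conveniences, not part of the paper's designs):

import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: exp(x - max) / sum(exp(x - max))
    e = np.exp(x - np.max(x, axis=axis, keepdims=True))
    return e / np.sum(e, axis=axis, keepdims=True)

def squash(s, axis=-1, eps=1e-8):
    # Squash nonlinearity from the original CapsNet formulation:
    # v = (||s||^2 / (1 + ||s||^2)) * (s / ||s||)
    sq_norm = np.sum(s * s, axis=axis, keepdims=True)
    return (sq_norm / (1.0 + sq_norm)) * s / np.sqrt(sq_norm + eps)

The exponentiation, division, and square root in these formulas are the costly operations that approximate designs replace; the eps term above only guards against division by zero in software.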