Hyperdimensional computing (HDC) has emerged as a lightweight learning paradigm with lower computation and energy requirements than conventional techniques. In HDC, data points are represented by high-dimensional vectors (hypervectors), which are mapped to a high-dimensional space (hyperspace). Typically, a large hypervector dimension ($\geq 1000$) is required to achieve accuracy comparable to conventional alternatives. However, unnecessarily large hypervectors increase hardware and energy costs, which can undermine HDC's benefits. This paper presents a technique to minimize the hypervector dimension while maintaining the accuracy and improving the robustness of the classifier. To this end, we formulate the hypervector design as a multi-objective optimization problem for the first time in the literature. The proposed approach decreases the hypervector dimension by more than $32\times$ while maintaining or increasing the accuracy achieved by conventional HDC. Experiments on a commercial hardware platform show that the proposed approach achieves more than one order of magnitude reduction in model size, inference time, and energy consumption. We also demonstrate the trade-off between accuracy and robustness to noise, and provide Pareto-front solutions as design choices in hypervector design.
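As background for the hypervector representation and the role of the dimension $D$ discussed above, the following is a minimal sketch of a record-based HDC encoder and prototype classifier in Python/NumPy, with a sweep over $D$ to illustrate the accuracy/cost trade-off. This is not the paper's implementation or its multi-objective optimization; the function names (`random_hv`, `encode`, `train_prototypes`, `predict`), the quantization scheme, and the toy data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_hv(d):
    """Random bipolar {-1, +1} hypervector of dimension d."""
    return rng.choice([-1, 1], size=d)

def encode(sample, feature_hvs, level_hvs, n_levels):
    """Bind each feature's position hypervector with the hypervector of its
    quantized level, then bundle (sum) and bipolarize."""
    levels = np.clip((sample * n_levels).astype(int), 0, n_levels - 1)
    bound = feature_hvs * level_hvs[levels]   # element-wise binding
    return np.sign(bound.sum(axis=0))         # bundling

def train_prototypes(X, y, feature_hvs, level_hvs, n_levels, n_classes):
    """Class prototypes are bundles of the encoded training samples."""
    protos = np.zeros((n_classes, feature_hvs.shape[1]))
    for x, label in zip(X, y):
        protos[label] += encode(x, feature_hvs, level_hvs, n_levels)
    return protos

def predict(X, protos, feature_hvs, level_hvs, n_levels):
    """Assign each sample to the prototype with the highest cosine similarity."""
    preds = []
    for x in X:
        q = encode(x, feature_hvs, level_hvs, n_levels)
        sims = protos @ q / (np.linalg.norm(protos, axis=1) * np.linalg.norm(q) + 1e-12)
        preds.append(np.argmax(sims))
    return np.array(preds)

# Toy usage (synthetic data): sweep the hypervector dimension D.
n_features, n_levels, n_classes = 16, 10, 3
X = rng.random((300, n_features))
y = np.clip((X[:, 0] * n_classes).astype(int), 0, n_classes - 1)

for d in (64, 256, 1024):
    feature_hvs = np.stack([random_hv(d) for _ in range(n_features)])
    level_hvs = np.stack([random_hv(d) for _ in range(n_levels)])
    protos = train_prototypes(X[:200], y[:200], feature_hvs, level_hvs, n_levels, n_classes)
    acc = (predict(X[200:], protos, feature_hvs, level_hvs, n_levels) == y[200:]).mean()
    print(f"D={d:5d}  held-out accuracy={acc:.2f}")
```

Because model size, inference time, and energy all scale with $D$, shrinking $D$ without losing accuracy (the goal of the proposed multi-objective formulation) directly reduces those costs.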