Hyperdimensional Computing (HDC) has received considerable attention as an emerging non-von Neumann computing paradigm. Inspired by the way the human brain functions, HDC leverages high-dimensional patterns to perform learning tasks. Compared with neural networks, HDC offers advantages such as energy efficiency and smaller model size, but exhibits sub-par learning capability in sophisticated applications. Recently, researchers have observed that, when combined with neural network components, HDC can achieve better performance than conventional HDC models. This motivates us to explore the theoretical foundations of HDC more deeply, particularly its connections with and differences from neural networks. In this paper, we conduct a comparative study between HDC and neural networks and offer a different perspective: an HDC model can be derived from an extremely compact neural network trained upfront. Experimental results show that such a neural network-derived HDC model can achieve up to 21% and 5% accuracy improvement over conventional and learning-based HDC models, respectively. This paper aims to provide more insight and shed light on future directions for research on this popular emerging learning scheme.
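As a minimal sketch of the contrast drawn above (not the paper's exact procedure), the snippet below shows a conventional HDC classifier that encodes inputs with a fixed random projection, next to a "neural network-derived" variant that reuses the first-layer weights of a compact pre-trained network as the projection matrix. All names (`encode`, `train_prototypes`, `W_trained`, the dimensions `D` and `d`) are hypothetical illustrations, and `W_trained` is a random placeholder standing in for weights trained upfront.

```python
import numpy as np

D = 10_000          # hypervector dimensionality (assumed)
d = 64              # input feature dimensionality (assumed)
rng = np.random.default_rng(0)

def encode(x, proj):
    """Encode a feature vector into a bipolar hypervector via projection + sign."""
    return np.sign(proj @ x)

def train_prototypes(X, y, proj, num_classes):
    """Bundle (sum) the hypervectors of each class into a class prototype."""
    protos = np.zeros((num_classes, proj.shape[0]))
    for xi, yi in zip(X, y):
        protos[yi] += encode(xi, proj)
    return protos

def predict(x, proj, protos):
    """Classify by cosine similarity to the class prototypes."""
    h = encode(x, proj)
    sims = protos @ h / (np.linalg.norm(protos, axis=1) * np.linalg.norm(h) + 1e-9)
    return int(np.argmax(sims))

# Conventional HDC: a fixed random projection defines the encoding.
proj_random = rng.standard_normal((D, d))

# NN-derived HDC (sketch): the projection rows come from the first-layer
# weights of a compact network trained upfront. Here W_trained is only a
# placeholder; in practice it would be extracted from the trained network.
W_trained = rng.standard_normal((D, d))
proj_learned = W_trained
```

Under this reading, the only change between the two variants is which matrix defines the high-dimensional encoding; the bundling and similarity-based inference stages remain standard HDC operations.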