Fully analogue neural computation requires hardware that can implement both linear and nonlinear transformations without digital assistance. While analogue in-memory computing efficiently realizes matrix-vector multiplication, the absence of learnable analogue nonlinearities remains a central bottleneck. Here we introduce KANalogue, a fully analogue realization of Kolmogorov-Arnold Networks (KANs) that instantiates univariate basis functions directly using negative-differential-resistance (NDR) devices. By mapping the intrinsic current-voltage characteristics of NDR devices to learnable coordinate-wise nonlinear functions, KANalogue embeds function approximation into device physics while preserving a fully analogue signal path. Using cold-metal tunnel diodes as a representative platform, we construct diverse nonlinear bases and combine them through crossbar-based analogue summation. Experiments on MNIST, FashionMNIST, and CIFAR-10 demonstrate that KANalogue achieves competitive accuracy with substantially fewer parameters and higher crossbar node efficiency than analogue MLPs, while approaching the performance of digital KANs under strict hardware constraints. The framework is not limited to a specific device technology and naturally generalizes to a broad class of NDR devices. These results establish a device-grounded route toward scalable, energy-efficient, fully analogue neural networks.