Conventional neural structures typically communicate through analog quantities such as currents or voltages. However, as CMOS devices shrink and supply voltages decrease, the dynamic range of voltage- and current-domain analog circuits narrows, design margins shrink, and noise immunity degrades. Moreover, the operational amplifiers (op-amps) and clocked or asynchronous comparators used in conventional designs incur high energy consumption and large chip area, which hinders the construction of spiking neural networks. In view of this, we propose a neural structure that generates and transmits time-domain signals, comprising a neuron module, a synapse module, and two weight modules. The proposed structure is driven by leakage currents of transistors operating in the triode region and uses neither op-amps nor comparators, thus achieving higher energy and area efficiency than conventional designs. In addition, because the modules communicate internally via time-domain signals, the structure offers greater noise immunity and simplified inter-module wiring. The proposed neural structure was fabricated in TSMC 65 nm CMOS technology. The neuron and synapse occupy 127 µm² and 231 µm², respectively, while achieving millisecond time constants. Chip measurements confirm that the proposed structure successfully implements time-domain signal communication with millisecond time constants, a critical step toward hardware reservoir computing for human-computer interaction.
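To make the time-domain communication idea concrete, the following is a minimal behavioral sketch in Python of a leaky integrate-and-fire abstraction: weighted input pulses are integrated with a millisecond-scale leak, and the output carries information in *when* a pulse fires rather than in an analog voltage level. All names, parameters, and the simulation structure here are illustrative assumptions for exposition, not the fabricated circuit or its measured values.

```python
import numpy as np

# Hypothetical behavioral model of time-domain neural communication.
# DT, TAU, V_TH, and the LIF dynamics are illustrative assumptions,
# not parameters of the fabricated 65 nm circuit.
DT = 1e-4    # simulation step: 0.1 ms
TAU = 5e-3   # millisecond-scale leak time constant (illustrative)
V_TH = 1.0   # normalized firing threshold

def simulate(spike_times, weight, t_end=50e-3):
    """Integrate weighted input pulses with an exponential leak and
    return output spike times; the wire between modules carries only
    the timing of events, not analog levels."""
    v, out_times = 0.0, []
    spikes = set(np.round(np.asarray(spike_times) / DT).astype(int))
    for step in range(int(t_end / DT)):
        v *= np.exp(-DT / TAU)           # leak toward rest
        if step in spikes:
            v += weight                  # synaptic kick per input pulse
        if v >= V_TH:
            out_times.append(step * DT)  # emit a time-domain event
            v = 0.0                      # reset after firing
    return out_times

# Four input pulses at 1 ms intervals with weight 0.4 produce one
# output event at ~4 ms; the result is encoded purely in its timing.
print(simulate(spike_times=[1e-3, 2e-3, 3e-3, 4e-3], weight=0.4))
```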