Neural gates compute functions based on weighted sums of their input variables. The expressive power of a neural gate (the number of distinct functions it can compute) depends on its weight sizes and, in general, large weights (exponential in the number of inputs) are required. Studying the trade-offs among weight sizes, circuit size, and depth is a well-studied topic both in circuit complexity theory and in the practice of neural computation. We propose a new approach to studying these complexity trade-offs by considering a related algebraic framework. Specifically, given a single linear equation with arbitrary coefficients, we would like to express it using a system of linear equations with smaller (even constant) coefficients. The techniques we develop are based on Siegel's Lemma for the bounds, anti-concentration inequalities for the existential results, and extensions of Sylvester-type Hadamard matrices for the constructions. We explicitly construct a constant-weight, optimal-size matrix that computes the EQUALITY function (checking whether two integers expressed in binary are equal); computing EQUALITY with a single linear equation requires exponentially large weights. In addition, we prove the existence of matrices with the best-known weight sizes (linear) that compute the COMPARISON function (comparing two integers expressed in binary). In the context of circuit complexity theory, our results improve the upper bounds on the weight sizes for the best-known circuit sizes for EQUALITY and COMPARISON.
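To make the weight-size trade-off concrete, the following is a minimal Python sketch (an illustration of the two extremes, not the paper's construction). EQUALITY of two n-bit integers can be decided by a single linear equation whose coefficients 2^i are exponential in n, or by a trivial system of n equations whose coefficients all lie in {-1, 0, +1}; the function names and bit-vector encoding below are ours, chosen only for this example.

```python
# Illustration (not the paper's construction): two extreme ways to decide
# EQUALITY(x, y) for n-bit integers using systems of linear equations.

def equality_single_equation(x_bits, y_bits):
    """One equation with exponentially large coefficients:
    sum over i of 2^i * (x_i - y_i) = 0."""
    return sum((2 ** i) * (xi - yi)
               for i, (xi, yi) in enumerate(zip(x_bits, y_bits))) == 0

def equality_constant_weights(x_bits, y_bits):
    """n equations, all coefficients in {-1, 0, +1}:
    x_i - y_i = 0 for every bit position i."""
    return all(xi - yi == 0 for xi, yi in zip(x_bits, y_bits))

if __name__ == "__main__":
    n = 8
    for a, b in [(37, 37), (37, 38), (0, 255)]:
        x = [(a >> i) & 1 for i in range(n)]  # little-endian bit vectors
        y = [(b >> i) & 1 for i in range(n)]
        assert equality_single_equation(x, y) \
            == equality_constant_weights(x, y) == (a == b)
    print("Both formulations agree: one equation with weights 2^i, "
          "or n equations with weights in {-1, 0, +1}.")
```

Per the abstract, the paper's explicit construction improves on the trivial n-equation system sketched above: it keeps the coefficients constant while using an optimal (smaller) number of equations.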