We present polynomial-time and sample-efficient algorithms for learning an unknown depth-2 feedforward neural network with general ReLU activations, under mild non-degeneracy assumptions. In particular, we consider learning an unknown network of the form $f(x) = {a}^{\mathsf{T}}\sigma({W}^\mathsf{T}x+b)$, where $x$ is drawn from the Gaussian distribution, and $\sigma(t) := \max(t,0)$ is the ReLU activation. Prior work on learning networks with ReLU activations assumes that the bias $b$ is zero. In order to deal with the presence of the bias terms, our proposed algorithm robustly decomposes multiple higher-order tensors arising from the Hermite expansion of the function $f(x)$. Using these ideas, we also establish identifiability of the network parameters under minimal assumptions.
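As a toy illustration of the underlying idea (a sketch, not the paper's algorithm), the second-order Hermite moment $\mathbb{E}[f(x)(xx^{\mathsf{T}} - I)]$ can be estimated from Gaussian samples; by Stein's identity it equals $\sum_i a_i\,\varphi(b_i)\, w_i w_i^{\mathsf{T}}$ for the standard normal density $\varphi$, so the bias enters only through scalar coefficients and the moment matrix has rank equal to the number of hidden units. All parameter values below are hypothetical, chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 5, 200_000

# Hypothetical illustrative network: k = 2 hidden units with
# orthonormal weight columns e1, e2 and nonzero biases.
W = np.eye(d)[:, :2]
b = np.array([0.5, -0.5])
a = np.array([1.0, 1.0])

def f(x):
    """Depth-2 ReLU network f(x) = a^T sigma(W^T x + b)."""
    return np.maximum(x @ W + b, 0.0) @ a

x = rng.standard_normal((n, d))
y = f(x)

# Empirical second-order Hermite (Stein) moment:
#   T2 = E[f(x) (x x^T - I)] = sum_i a_i * phi(b_i) * w_i w_i^T,
# where phi is the standard normal density. The biases only rescale
# each rank-one term, so T2 is (approximately) rank 2 here.
T2_hat = (x * y[:, None]).T @ x / n - y.mean() * np.eye(d)

svals = np.linalg.svd(T2_hat, compute_uv=False)
# Both nonzero singular values should concentrate near
# a_i * phi(b_i) = phi(0.5) ~ 0.352; the remaining ones near 0.
```

Recovering the directions $w_i$ themselves requires the higher-order tensors (third and fourth Hermite moments) that the abstract refers to, since a matrix decomposition alone only identifies the span of the weight vectors.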