We show a simple reduction which demonstrates the cryptographic hardness of learning a single periodic neuron over isotropic Gaussian distributions in the presence of noise. More precisely, our reduction shows that any polynomial-time algorithm (not necessarily gradient-based) for learning such functions under small noise implies a polynomial-time quantum algorithm for solving worst-case lattice problems, whose hardness forms the foundation of lattice-based cryptography. Our core hard family of functions, which is well-approximated by one-layer neural networks, takes the general form of a univariate periodic function applied to an affine projection of the data. These functions have appeared in previous seminal works which demonstrate their hardness against gradient-based (Shamir'18) and Statistical Query (SQ) algorithms (Song et al.'17). We show that if (polynomially) small noise is added to the labels, the intractability of learning these functions applies to all polynomial-time algorithms under the aforementioned cryptographic assumptions. Moreover, we demonstrate the necessity of noise in the hardness result by designing a polynomial-time algorithm for learning certain families of such functions under exponentially small adversarial noise. Our proposed algorithm is not a gradient-based or an SQ algorithm, but is rather based on the celebrated Lenstra-Lenstra-Lov\'asz (LLL) lattice basis reduction algorithm. Furthermore, in the absence of noise, this algorithm can be directly applied to solve CLWE detection (Bruna et al.'21) and phase retrieval with an optimal sample complexity of $d+1$ samples. In the former case, this improves upon the quadratic-in-$d$ sample complexity required in (Bruna et al.'21). In the latter case, this improves upon the state-of-the-art AMP-based algorithm, which requires approximately $1.128d$ samples (Barbier et al.'19).
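The abstract's algorithmic contribution rests on LLL lattice basis reduction. As background for readers unfamiliar with it, the following is a minimal textbook sketch of the LLL algorithm in Python (not the paper's learning algorithm itself): given an integer basis of a lattice, it outputs a size-reduced basis satisfying the Lov\'asz condition with parameter $\delta = 3/4$. Exact rational arithmetic via `fractions.Fraction` is used for clarity rather than efficiency; the repeated full Gram-Schmidt recomputation is a simplification of practical implementations.

```python
from fractions import Fraction

def dot(u, v):
    # Exact inner product over the rationals.
    return sum(Fraction(a) * Fraction(b) for a, b in zip(u, v))

def gram_schmidt(basis):
    # Returns the Gram-Schmidt orthogonalization of `basis` and the
    # coefficients mu[i][j] = <b_i, b*_j> / <b*_j, b*_j>.
    n = len(basis)
    ortho = []
    mu = [[Fraction(0)] * n for _ in range(n)]
    for i in range(n):
        v = [Fraction(x) for x in basis[i]]
        for j in range(i):
            mu[i][j] = dot(basis[i], ortho[j]) / dot(ortho[j], ortho[j])
            v = [a - mu[i][j] * b for a, b in zip(v, ortho[j])]
        ortho.append(v)
    return ortho, mu

def lll(basis, delta=Fraction(3, 4)):
    # Textbook LLL reduction: size-reduce each vector against its
    # predecessors, swap adjacent vectors when the Lovasz condition
    # fails, and repeat until the whole basis is delta-LLL-reduced.
    basis = [list(map(int, v)) for v in basis]
    n = len(basis)
    ortho, mu = gram_schmidt(basis)
    k = 1
    while k < n:
        for j in range(k - 1, -1, -1):
            q = round(mu[k][j])  # nearest-integer size reduction
            if q != 0:
                basis[k] = [a - q * b for a, b in zip(basis[k], basis[j])]
                ortho, mu = gram_schmidt(basis)
        # Lovasz condition: ||b*_k||^2 >= (delta - mu_{k,k-1}^2) ||b*_{k-1}||^2
        if dot(ortho[k], ortho[k]) >= (delta - mu[k][k - 1] ** 2) * dot(ortho[k - 1], ortho[k - 1]):
            k += 1
        else:
            basis[k], basis[k - 1] = basis[k - 1], basis[k]
            ortho, mu = gram_schmidt(basis)
            k = max(k - 1, 1)
    return basis

# Example: reduce a small 3-dimensional integer basis.
reduced = lll([[1, 1, 1], [-1, 0, 2], [3, 5, 6]])
```

In the paper's noiseless setting, the key property exploited is that LLL finds short vectors in a lattice built from the samples, which is why only $d+1$ samples suffice; production uses would rely on optimized implementations such as fplll rather than a sketch like this.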