We prove hardness-of-learning results under a well-studied assumption on the existence of local pseudorandom generators. As we show, this assumption allows us to surpass the current state of the art and to prove hardness of various basic problems for which no hardness results were previously known. Our results include: hardness of learning shallow ReLU neural networks under the Gaussian distribution and other distributions; hardness of learning intersections of $\omega(1)$ halfspaces, DNF formulas with $\omega(1)$ terms, and ReLU networks with $\omega(1)$ hidden neurons; hardness of weakly learning deterministic finite automata under the uniform distribution; hardness of weakly learning depth-$3$ Boolean circuits under the uniform distribution; as well as distribution-specific hardness results for learning DNF formulas and intersections of halfspaces. We also establish lower bounds on the complexity of learning intersections of a constant number of halfspaces, and of learning ReLU networks with a constant number of hidden neurons. Moreover, our results imply the hardness of virtually all improper PAC-learning problems (both distribution-free and distribution-specific) that were previously shown hard under other assumptions.