We give exponential statistical query (SQ) lower bounds for learning two-hidden-layer ReLU networks with respect to Gaussian inputs in the standard (noise-free) model. No general SQ lower bounds were known for learning ReLU networks of any depth in this setting: previous SQ lower bounds held only for adversarial noise models (agnostic learning) or restricted models such as correlational SQ. Prior work hinted at the impossibility of our result: Vempala and Wilmes showed that general SQ lower bounds cannot apply to any real-valued family of functions that satisfies a simple non-degeneracy condition. To circumvent their result, we refine a lifting procedure due to Daniely and Vardi that reduces Boolean PAC learning problems to Gaussian ones. We show how to extend their technique to other learning models and, in many well-studied cases, obtain a more efficient reduction. As such, we also prove new cryptographic hardness results for PAC learning two-hidden-layer ReLU networks, as well as new lower bounds for learning constant-depth ReLU networks from membership queries.
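For context, the statistical query model invoked above has a standard formulation due to Kearns; the following definitional sketch is added for orientation only and is not a restatement of this paper's precise setup.

% Standard SQ oracle (Kearns); a contextual sketch, not this paper's exact formalization.
An SQ learner for a labeled distribution $D$ over $\mathbb{R}^n \times \mathbb{R}$ never receives individual examples. Instead, it submits bounded queries $\phi \colon \mathbb{R}^n \times \mathbb{R} \to [-1,1]$ to an oracle $\mathrm{STAT}_D(\tau)$, which may return any value $v$ satisfying
\[
  \bigl| v - \mathbb{E}_{(x,y) \sim D}[\phi(x,y)] \bigr| \le \tau,
\]
where $\tau > 0$ is the query tolerance. A lower bound in this model asserts that any such learner must either issue a very large number of queries or demand a correspondingly small tolerance $\tau$.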