We give superpolynomial statistical query (SQ) lower bounds for learning two-hidden-layer ReLU networks with respect to Gaussian inputs in the standard (noise-free) model. No general SQ lower bounds were known for learning ReLU networks of any depth in this setting: previous SQ lower bounds held only for adversarial noise models (agnostic learning) or restricted models such as correlational SQ. Prior work hinted at the impossibility of our result: Vempala and Wilmes showed that general SQ lower bounds cannot apply to any real-valued family of functions that satisfies a simple non-degeneracy condition. To circumvent their result, we refine a lifting procedure due to Daniely and Vardi that reduces Boolean PAC learning problems to Gaussian ones. We show how to extend their technique to other learning models and, in many well-studied cases, obtain a more efficient reduction. As such, we also prove new cryptographic hardness results for PAC learning two-hidden-layer ReLU networks, as well as new lower bounds for learning constant-depth ReLU networks from label queries.