This paper considers the following question: how well can depth-two ReLU networks with randomly initialized bottom-level weights represent smooth functions? We give near-matching upper- and lower-bounds for $L_2$-approximation in terms of the Lipschitz constant, the desired accuracy, and the dimension of the problem, as well as similar results in terms of Sobolev norms. Our positive results employ tools from harmonic analysis and ridgelet representation theory, while our lower-bounds are based on (robust versions of) dimensionality arguments.
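For concreteness, the network class in question admits the following standard formulation (a sketch in our own notation, not the paper's; we assume, as is typical in the random-features setting, that only the top-level coefficients are trained):
\[
f(x) \;=\; \sum_{i=1}^{k} a_i \,\mathrm{ReLU}\!\big(\langle w_i, x\rangle + b_i\big),
\qquad \mathrm{ReLU}(t) = \max\{t, 0\},
\]
where the bottom-level weights $(w_i, b_i)$ are drawn at random and held fixed, and the coefficients $a_i$ are chosen to approximate the target function in $L_2$.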