This paper concentrates on the approximation power of deep feed-forward neural networks in terms of width and depth. It is proved by construction that ReLU networks with width $\mathcal{O}\big(\max\{d\lfloor N^{1/d}\rfloor,\, N+2\}\big)$ and depth $\mathcal{O}(L)$ can approximate a H\"older continuous function on $[0,1]^d$ with an approximation rate $\mathcal{O}\big(\lambda\sqrt{d}\,(N^2L^2\ln N)^{-\alpha/d}\big)$, where $\alpha\in (0,1]$ and $\lambda>0$ are the H\"older order and the H\"older constant, respectively. This rate is optimal up to a constant in terms of width and depth separately, whereas existing results are only nearly optimal, lacking the logarithmic factor in the approximation rate. More generally, for an arbitrary continuous function $f$ on $[0,1]^d$, the approximation rate becomes $\mathcal{O}\big(\sqrt{d}\,\omega_f\big((N^2L^2\ln N)^{-1/d}\big)\big)$, where $\omega_f(\cdot)$ is the modulus of continuity. We also extend our analysis to any continuous function $f$ on a bounded set. In particular, if ReLU networks with depth $31$ and width $\mathcal{O}(N)$ are used to approximate one-dimensional Lipschitz continuous functions on $[0,1]$ with Lipschitz constant $\lambda>0$, then the approximation rate in terms of the total number of parameters, $W=\mathcal{O}(N^2)$, becomes $\mathcal{O}\big(\tfrac{\lambda}{W\ln W}\big)$, which, to the best of our knowledge, has not previously been established for fixed-depth ReLU networks.
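To connect the two rates above, here is a brief worked instantiation (a sketch under the stated assumptions, not the paper's proof): take $d=1$ and $\alpha=1$ (the Lipschitz case) and fix the depth, so $L=\mathcal{O}(1)$. The general rate then reads
$$
\mathcal{O}\big(\lambda\sqrt{d}\,(N^2L^2\ln N)^{-\alpha/d}\big)
= \mathcal{O}\Big(\frac{\lambda}{N^2\ln N}\Big).
$$
A fixed-depth network of width $\mathcal{O}(N)$ has $W=\mathcal{O}(N^2)$ parameters, hence $\ln W = 2\ln N + \mathcal{O}(1)$ and
$$
\mathcal{O}\Big(\frac{\lambda}{N^2\ln N}\Big)
= \mathcal{O}\Big(\frac{\lambda}{W\ln W}\Big),
$$
which recovers the $\mathcal{O}\big(\tfrac{\lambda}{W\ln W}\big)$ rate quoted for the depth-$31$ construction.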

