Several current bounds on the maximal number of affine regions of a ReLU feed-forward neural network are special cases of the framework of [1], which relies on layer-wise activation histogram bounds. We analyze and partially solve a problem in algebraic topology whose solution would fully exploit this framework. Our partial solution already yields slightly tighter bounds and suggests insights into how parameter initialization methods can affect the number of regions. Furthermore, we extend the framework to allow the composition of subnetwork activation histogram bounds instead of layer-wise ones, reducing the number of required compositions, each of which loosens the resulting bound.