One way to characterize the expressiveness of a piecewise linear neural network is by the number of linear regions, or pieces, of the function it models. Substantial progress has been made on this topic through lower and upper bounds on the maximum number of linear regions and through an exact counting procedure. However, these bounds account only for the dimensions of the network, and exact counting may take a prohibitive amount of time, making it infeasible to benchmark the expressiveness of networks. In this work, we approximate the number of linear regions of specific rectifier networks with an algorithm for probabilistic lower bounds on the number of solutions of mixed-integer linear sets. In addition, we present a tighter upper bound that leverages the network coefficients. We test both on trained networks. The algorithm for probabilistic lower bounds is several orders of magnitude faster than exact counting and yields values of a similar order of magnitude, making our approach a viable method for comparing the expressiveness of such networks. The refined upper bound is particularly stronger on networks with narrow layers.
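The lower-bound idea can be illustrated with a much simpler sampling scheme than the paper's mixed-integer algorithm: every input to a rectifier network falls into exactly one linear region, and each region is characterized by the on/off pattern of the ReLU units, so the number of distinct activation patterns observed over random inputs is a probabilistic lower bound on the number of linear regions. The sketch below uses a tiny random network of assumed shape (2-16-16); it is only an illustration of the lower-bound principle, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny 2-16-16 rectifier network with random weights
# (an assumed example, not a network from the paper).
W1 = rng.normal(size=(16, 2)); b1 = rng.normal(size=16)
W2 = rng.normal(size=(16, 16)); b2 = rng.normal(size=16)

def activation_pattern(x):
    """Return the on/off pattern of every ReLU unit at input x."""
    h1 = W1 @ x + b1
    h2 = W2 @ np.maximum(h1, 0.0) + b2
    return tuple(np.concatenate([h1 > 0, h2 > 0]))

# Each distinct pattern corresponds to at most one linear region,
# so the count of distinct patterns seen is a valid lower bound.
patterns = {activation_pattern(rng.uniform(-1.0, 1.0, size=2))
            for _ in range(10_000)}
print(f"probabilistic lower bound: {len(patterns)} linear regions")
```

Like the paper's approach, this bound only improves with more samples and never overcounts; unlike exact counting, its cost is linear in the number of samples rather than exponential in the network size.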