Self size-estimating feedforward network (SSFN) is a feedforward multilayer network. In the existing SSFN, a part of each weight matrix is trained using a layer-wise convex optimization approach (supervised training), while the other part is chosen as a random matrix instance (unsupervised training). In this article, the use of deterministic transforms instead of random matrix instances for the SSFN weight matrices is explored. The use of deterministic transforms provides a reduction in computational complexity. Several deterministic transforms are investigated, such as the discrete cosine transform, Hadamard transform, Hartley transform, and wavelet transforms. The choice of a deterministic transform from a set of transforms is made in an unsupervised manner. To this end, two methods based on the statistical parameters of the features are developed. The proposed methods make it possible to design a neural net in which the deterministic transform can vary across the layers' weight matrices. The effectiveness of the proposed approach vis-à-vis the SSFN is illustrated for object classification tasks using several benchmark datasets.
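As a minimal illustration of the idea, the sketch below builds orthonormal DCT, Hadamard, and Hartley matrices that could stand in for a random weight-matrix part, and selects one of them from data in an unsupervised way. The selection criterion used here (spectral entropy of the transformed features' mean energy) is a hypothetical stand-in: the abstract states only that selection is based on the features' statistical parameters, not which ones.

```python
import numpy as np

def dct_matrix(n):
    # Orthonormal DCT-II matrix: rows are frequencies, columns are positions.
    k = np.arange(n)
    M = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    M[0] *= np.sqrt(1.0 / n)
    M[1:] *= np.sqrt(2.0 / n)
    return M

def hadamard_matrix(n):
    # Sylvester construction; n must be a power of two.
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H / np.sqrt(n)

def hartley_matrix(n):
    # Discrete Hartley transform: cas(x) = cos(x) + sin(x).
    k = np.arange(n)
    A = 2.0 * np.pi * np.outer(k, k) / n
    return (np.cos(A) + np.sin(A)) / np.sqrt(n)

def pick_transform(X, transforms):
    # Hypothetical unsupervised criterion (not from the paper): prefer the
    # transform whose output concentrates energy in the fewest coefficients,
    # measured by the entropy of the normalized per-coefficient mean energy.
    best_name, best_score = None, np.inf
    for name, T in transforms.items():
        Y = X @ T.T                       # transform each feature row
        e = np.mean(Y ** 2, axis=0)       # mean energy per coefficient
        p = e / e.sum()
        score = -np.sum(p * np.log(p + 1e-12))
        if score < best_score:
            best_name, best_score = name, score
    return best_name
```

Because all three matrices are orthonormal, swapping them in for a random instance preserves the norm of the propagated features, while a fixed fast transform (FFT-like DCT/Hadamard/Hartley algorithms) avoids both storing and multiplying by a dense random matrix.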