We introduce T-Basis, a novel concept for a compact representation of a set of tensors, each of an arbitrary shape, as is often encountered in Neural Networks. Each tensor in the set is modeled using Tensor Rings, though the concept applies to other Tensor Networks. Owing its name to the T-shape of nodes in the diagram notation of Tensor Rings, T-Basis is simply a list of equally shaped three-dimensional tensors used to represent Tensor Ring nodes. Such a representation allows us to parameterize the tensor set with a small number of parameters (coefficients of the T-Basis tensors), scaling logarithmically with the size of each tensor in the set and linearly with the dimensionality of the T-Basis. We evaluate the proposed approach on the task of neural network compression and demonstrate that it achieves high compression rates with acceptable drops in performance. Finally, we analyze the memory and operation requirements of the compressed networks and conclude that T-Basis networks are equally well suited for training and inference in resource-constrained environments and for usage on edge devices.
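To make the parameterization concrete, here is a minimal NumPy sketch, with all sizes (rank `r`, mode size `m`, basis dimensionality `b`, number of cores `d`) chosen hypothetically for illustration. It builds Tensor Ring cores as linear combinations of a shared list of equally shaped three-dimensional basis tensors, so each core costs only `b` coefficients rather than `r * m * r` raw entries; this is a sketch of the general idea, not the paper's exact implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: ring rank r, mode size m, basis dimensionality b,
# and d cores forming one Tensor Ring.
r, m, b, d = 4, 3, 8, 5

# The shared T-Basis: b equally shaped (r, m, r) tensors.
basis = rng.standard_normal((b, r, m, r))

# Each of the d cores is parameterized by only b coefficients,
# instead of r * m * r raw entries.
coeffs = rng.standard_normal((d, b))
cores = np.tensordot(coeffs, basis, axes=([1], [0]))  # shape (d, r, m, r)

def tr_element(cores, idx):
    """One element of the full tensor: trace of a product of core slices."""
    mat = np.eye(cores[0].shape[0])
    for k, i in enumerate(idx):
        mat = mat @ cores[k][:, i, :]
    return np.trace(mat)

# Parameter counts: the full d-way tensor has m**d entries, while the
# per-tensor cost here is just the d * b coefficients (the basis is shared
# across the whole tensor set).
full_params = m ** d          # 243 for these sizes
coeff_params = d * b          # 40 for these sizes
```

Contracting the ring explicitly for these small sizes (e.g. via `np.einsum('aib,bjc,ckd,dle,ema->ijklm', *cores)`) reproduces the same elements as `tr_element`, which illustrates why the coefficient count, rather than the exponentially large full tensor, determines the storage cost.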