Tensor Decomposition Networks (TDNs) prevail for their inherently compact architectures. For convenience, we present TedNet, a toolkit based on the PyTorch framework, to give researchers a flexible way to exploit TDNs. TedNet implements five kinds of tensor decomposition (i.e., CANDECOMP/PARAFAC (CP), Block-Term Tucker (BT), Tucker-2, Tensor Train (TT), and Tensor Ring (TR)) on traditional deep neural layers: the convolutional layer and the fully-connected layer. With these basic layers, it is simple to construct a variety of TDNs such as TR-ResNet, TT-LSTM, etc. TedNet is available at https://github.com/tnbar/tednet.
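To illustrate the kind of layer such toolkits provide, here is a minimal NumPy sketch of the idea behind a Tensor Train (TT) fully-connected layer. This is not tednet's actual API; the mode sizes, TT-rank, and helper name `tt_linear` are illustrative assumptions chosen to keep the example small.

```python
import numpy as np

# Minimal sketch (NOT tednet's API): a 64->64 linear map stored in TT format.
# The input/output dims are factored as 64 = 8*8; the TT-rank r is set to 4.
rng = np.random.default_rng(0)
in_modes, out_modes, r = (8, 8), (8, 8), 4

# TT cores G_k with shape (r_{k-1}, in_k, out_k, r_k); boundary ranks are 1.
core0 = rng.standard_normal((1, in_modes[0], out_modes[0], r)) * 0.1
core1 = rng.standard_normal((r, in_modes[1], out_modes[1], 1)) * 0.1

def tt_linear(x):
    """Compute y = W x with W kept in TT format (W is never materialized)."""
    t = x.reshape(in_modes)  # fold the 64-vector into its (i1, i2) modes
    return np.einsum('aibr,rjcd,ij->bc', core0, core1, t).reshape(-1)

x = rng.standard_normal(64)
y = tt_linear(x)

# Sanity check against the explicitly materialized 64x64 weight matrix.
W = np.einsum('aibr,rjcd->bcij', core0, core1).reshape(64, 64)
assert np.allclose(y, W @ x)

# Compression: the TT cores store 1*8*8*4 + 4*8*8*1 = 512 parameters,
# versus 64*64 = 4096 for the dense weight -- an 8x reduction.
print(core0.size + core1.size, W.size)  # → 512 4096
```

Replacing the dense weight with TT cores is what makes TDNs compact: the parameter count grows with the TT-rank rather than with the product of the layer's input and output dimensions.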