It is well known that tensor network regression models operate on an exponentially large feature space, but questions remain as to how effectively they are able to utilize this space. Using the polynomial featurization from Novikov et al., we propose the interaction decomposition as a tool that can assess the relative importance of different regressors as a function of their polynomial degree. We apply this decomposition to tensor ring and tree tensor network models trained on the MNIST and Fashion MNIST datasets, and find that up to 75% of interaction degrees contribute meaningfully to these models. We also introduce a new type of tensor network model that is explicitly trained on only a small subset of interaction degrees, and find that these models can match or even outperform the full models using only a fraction of the exponential feature space. This suggests that standard tensor network models utilize their polynomial regressors in an inefficient manner, with the lower-degree terms being vastly under-utilized.
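As a minimal sketch of the ideas above (not the authors' code): in the Novikov et al. featurization, each input x_i is mapped to the local vector (1, x_i), and the full feature vector is the Kronecker product of these local vectors, so a regression over it acts on all 2^N multilinear monomials of the inputs. Grouping those monomials by how many distinct inputs they multiply together is the bookkeeping that underlies an interaction decomposition by degree. The helper names below are illustrative, not from the paper.

```python
import numpy as np

def full_feature_vector(x):
    # Kronecker product of the local maps phi(x_i) = [1, x_i];
    # the result enumerates all 2**N multilinear monomials of x.
    v = np.array([1.0])
    for xi in x:
        v = np.kron(v, np.array([1.0, xi]))
    return v

def interaction_degree(idx):
    # In the Kronecker ordering, bit b of idx selects the x factor (1)
    # or the constant (0) from the b-th local vector, so the number of
    # set bits is the monomial's interaction degree.
    return bin(idx).count("1")

x = np.array([0.5, -1.0, 2.0])
feats = full_feature_vector(x)  # length 2**3 = 8

# Group the monomial values by interaction degree.
by_degree = {}
for idx, val in enumerate(feats):
    by_degree.setdefault(interaction_degree(idx), []).append(val)

# Degree 0 is the constant term 1; degree 3 is x1*x2*x3 = 0.5 * -1.0 * 2.0.
print(by_degree[0], by_degree[3])
```

For N inputs there are C(N, d) monomials of degree d, so restricting a model to a small set of degrees (as in the degree-restricted models described above) keeps only a tiny fraction of the 2^N regressors.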