High-dimensional data in the form of tensors are challenging for kernel classification methods. To both reduce computational complexity and extract informative features, kernels based on low-rank tensor decompositions have been proposed. However, which features of the tensors these kernels exploit is often unclear. In this paper we propose a novel kernel based on the Tucker decomposition. For this kernel, the Tucker factors are computed by re-weighting the Tucker matrices with tuneable powers of the singular values from the HOSVD. This provides a mechanism to balance the contributions of the Tucker core and the Tucker factors of the data. We benchmark support tensor machines with this new kernel on several datasets. First, we generate synthetic data in which the two classes differ in either the Tucker factors or the Tucker core, and compare our novel kernel with previously existing kernels. We show that the new kernel is robust in both classification scenarios. We then test the new method on real-world datasets. The proposed kernel achieves higher test accuracy than the state-of-the-art tensor train multi-way multi-level kernel, at a significantly lower computational cost.
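To make the re-weighting idea concrete, the sketch below is a minimal, hypothetical illustration rather than the paper's implementation: it computes HOSVD factor matrices with NumPy and scales their columns by a tuneable power of the mode-wise singular values, so that a power of zero keeps the plain orthonormal factors while larger powers fold in more of the core's energy. The function name `reweighted_hosvd_factors` and the parameter `alpha` are assumptions introduced here for illustration.

```python
import numpy as np

def reweighted_hosvd_factors(X, alpha=0.5):
    """Illustrative sketch: HOSVD factor matrices of a tensor X, with each
    factor's columns re-weighted by a tuneable power `alpha` of the
    corresponding mode-wise singular values. `alpha` and the exact
    weighting scheme are assumptions, not the paper's definition."""
    factors = []
    for mode in range(X.ndim):
        # Mode-n unfolding: bring `mode` to the front and flatten the rest.
        unfolding = np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)
        # Left singular vectors and singular values of the unfolding (HOSVD step).
        U, s, _ = np.linalg.svd(unfolding, full_matrices=False)
        # Scale column j of U by s[j] ** alpha to balance factor vs. core information.
        factors.append(U * (s ** alpha))
    return factors

# Usage example: re-weighted factors of a random 3-way tensor.
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 9, 10))
factors = reweighted_hosvd_factors(A, alpha=0.5)
print([F.shape for F in factors])  # [(8, 8), (9, 9), (10, 10)]
```

A kernel between two tensors could then be built from similarities of these re-weighted factor matrices, with `alpha` tuned to emphasise either the factor subspaces or the core energy; the precise kernel construction is defined in the paper itself.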