The Hadamard product of tensor train (TT) tensors is a fundamental nonlinear operation in scientific computing and data analysis. However, because it tends to increase TT ranks significantly, the Hadamard product poses a major computational challenge in TT-based algorithms. To address this, it is crucial to develop recompression algorithms that mitigate the effects of this rank increase. Existing recompression algorithms require an explicit representation of the Hadamard product, resulting in high computational and storage costs. In this work, we propose a Hadamard-avoiding TT recompression (HaTT) algorithm, which reduces both computational complexity and storage requirements. By leveraging the structure of the Hadamard product of TT tensors and exploiting its Hadamard-product-free property, the HaTT algorithm achieves significantly lower complexity than existing TT recompression methods. This is confirmed through both complexity analysis and numerical experiments. Furthermore, the HaTT algorithm is applied to solve the Allen--Cahn equation, achieving substantial speedup over existing TT recompression algorithms without sacrificing accuracy.
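The rank growth referred to above can be made concrete with a small sketch. The following NumPy code (an illustration only, not the HaTT algorithm itself) forms the Hadamard product of two TT tensors by taking slice-wise Kronecker products of their cores, which multiplies the TT ranks: combining cores of ranks r and s yields a core of rank rs. All function names here are hypothetical helpers introduced for this example.

```python
import numpy as np

def tt_to_full(cores):
    # Contract a list of TT cores, each shaped (r_prev, n, r_next) with
    # boundary ranks 1, into the full dense tensor.
    full = cores[0]
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=([-1], [0]))
    return np.squeeze(full, axis=(0, -1))

def tt_hadamard(cores_a, cores_b):
    # Hadamard product in TT format: for each mode index, the new core
    # slice is the Kronecker product of the two original core slices,
    # so the resulting TT ranks are the products of the input ranks.
    out = []
    for A, B in zip(cores_a, cores_b):
        ra1, n, ra2 = A.shape
        rb1, _, rb2 = B.shape
        C = np.einsum('inj,knl->iknjl', A, B).reshape(ra1 * rb1, n, ra2 * rb2)
        out.append(C)
    return out

rng = np.random.default_rng(0)
shapes = [(1, 3, 2), (2, 3, 2), (2, 3, 1)]   # a rank-(2, 2) TT of a 3x3x3 tensor
a = [rng.standard_normal(s) for s in shapes]
b = [rng.standard_normal(s) for s in shapes]

prod_cores = tt_hadamard(a, b)
# Ranks multiply: the middle core goes from rank (2, 2) to rank (4, 4),
# which is why recompression after a Hadamard product is essential.
print(prod_cores[1].shape)                    # (4, 3, 4)
print(np.allclose(tt_to_full(prod_cores), tt_to_full(a) * tt_to_full(b)))
```

Classical recompression would now orthogonalize and truncate `prod_cores` explicitly at this inflated rank; the point of a Hadamard-avoiding scheme is to recompress without ever materializing these enlarged cores.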