Tensor algebra is essential for data-intensive workloads in various computational domains. Computational scientists face a trade-off between the degree of specialization provided by dense tensor algebra and the algorithmic efficiency of leveraging the structure provided by sparse tensors. This paper presents StructTensor, a framework that symbolically computes structure at compilation time. This is enabled by the Structured Tensor Unified Representation (STUR), an intermediate language that can capture tensor computations as well as their sparsity and redundancy structures. Through a mathematical view of lossless tensor computations, we show that our symbolic structure computation and the related optimizations are sound. Finally, for different tensor computation workloads and structures, we experimentally show how capturing the symbolic structure can result in outperforming state-of-the-art frameworks for both dense and sparse tensor algebra.
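To make the trade-off concrete, the following is a minimal sketch in NumPy, not the paper's STUR notation; the diagonal structure, matrix sizes, and variable names are illustrative assumptions. It shows the kind of specialization that becomes possible once a tensor's structure is known at compilation time:

    # Illustrative only (not StructTensor/STUR code): if the compiler knows
    # symbolically that A is diagonal, the product A @ B can be specialized
    # from O(n^3) dense work down to O(n^2).
    import numpy as np

    n = 512
    d = np.random.rand(n)          # diagonal entries of A
    B = np.random.rand(n, n)

    # Dense evaluation ignores the structure: O(n^3) multiply-adds.
    dense = np.diag(d) @ B

    # Structure-aware evaluation scales row i of B by d[i]: O(n^2).
    structured = d[:, None] * B

    assert np.allclose(dense, structured)

In the paper's terms, StructTensor derives such specializations automatically: sparsity and redundancy structures are propagated symbolically through STUR rather than encoded by hand for each kernel.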