We propose the Soft Graph Transformer (SGT), a soft-input soft-output neural architecture for MIMO detection. While Maximum Likelihood (ML) detection achieves optimal accuracy, its exponential complexity makes it infeasible for large systems, and conventional message-passing algorithms rely on asymptotic assumptions that often fail in finite dimensions. Recent Transformer-based detectors show strong performance but typically overlook the MIMO factor-graph structure and cannot exploit prior soft information. SGT addresses these limitations by combining self-attention, which encodes contextual dependencies within the symbol and constraint subgraphs, with graph-aware cross-attention, which performs structured message passing across the subgraphs. Its soft-input interface allows auxiliary priors to be integrated, producing effective soft outputs while maintaining computational efficiency. Experiments demonstrate that SGT achieves near-ML performance and provides a flexible and interpretable framework for receivers that exploit soft priors.
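To make the described architecture concrete, below is a minimal PyTorch sketch of one SGT-style layer: self-attention inside the symbol and constraint subgraphs, followed by cross-attention restricted to the edges of the bipartite MIMO factor graph. The class name `SGTLayer`, the tensor shapes, the hyperparameters, and the mask convention are illustrative assumptions, not details taken from the paper.

```python
# A minimal sketch, assuming symbol-node and constraint-node embeddings and a
# boolean edge mask derived from the channel's factor graph (all illustrative).
import torch
import torch.nn as nn


class SGTLayer(nn.Module):
    """One block: self-attention within each subgraph, then graph-aware
    cross-attention across the symbol/constraint bipartite factor graph."""

    def __init__(self, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        self.sym_self = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.con_self = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.sym_cross = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.con_cross = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.sym_norm = nn.LayerNorm(d_model)
        self.con_norm = nn.LayerNorm(d_model)

    def forward(self, sym, con, graph_mask):
        # sym: (B, Nt, d) symbol-node embeddings (may carry soft priors)
        # con: (B, Nr, d) constraint/observation-node embeddings
        # graph_mask: (Nt, Nr) bool, True where a symbol-constraint edge is absent
        sym = self.sym_norm(sym + self.sym_self(sym, sym, sym)[0])
        con = self.con_norm(con + self.con_self(con, con, con)[0])
        # cross-attention as structured message passing over factor-graph edges
        sym = sym + self.sym_cross(sym, con, con, attn_mask=graph_mask)[0]
        con = con + self.con_cross(con, sym, sym, attn_mask=graph_mask.T)[0]
        return sym, con


# Illustrative usage with a fully connected 4x4 MIMO factor graph (dense H).
B, Nt, Nr, d = 2, 4, 4, 64
layer = SGTLayer(d_model=d)
sym = torch.randn(B, Nt, d)   # e.g., embeddings of prior LLRs / soft symbols
con = torch.randn(B, Nr, d)   # e.g., embeddings of received signal and channel rows
mask = torch.zeros(Nt, Nr, dtype=torch.bool)  # no edges masked out
sym_out, con_out = layer(sym, con, mask)
```

In this sketch, soft inputs would enter through the symbol embeddings and soft outputs would be read off the final symbol representations via a per-symbol classifier; both interfaces are assumptions about how such a layer could be wired, not a statement of the paper's exact design.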