Deep neural network (DNN)-assisted channel coding designs, such as low-complexity neural decoders for existing codes or end-to-end neural-network-based autoencoder designs, have recently gained interest due to their improved performance and flexibility, particularly for communication scenarios in which high-performing structured code designs do not exist. Communication in the presence of feedback is one such scenario, and practical code design for feedback channels has remained an open challenge in coding theory for many decades. Recently, DNN-based designs have shown impressive results in exploiting feedback. In particular, generalized block attention feedback (GBAF) codes, which utilize the popular transformer architecture, have achieved significant improvements in block error rate (BLER) performance. However, previous works have focused mainly on passive feedback, where the transmitter observes a noisy version of the signal at the receiver. In this work, we show that GBAF codes can also be used for channels with active feedback. We implement a pair of transformer architectures, one at the transmitter and one at the receiver, which interact with each other sequentially, and achieve new state-of-the-art BLER performance, especially in the low-SNR regime.
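To make the sequential transmitter–receiver interaction concrete, the sketch below gives a minimal, hypothetical PyTorch rendering of the idea: a small transformer at the transmitter encodes its message bits together with the feedback it has received so far, while a second transformer at the receiver actively synthesizes its own feedback symbol from everything it has heard, with both the forward and feedback links modeled as AWGN channels. This is not the paper's GBAF architecture or training procedure; the module `Agent`, the helper `awgn`, and all dimensions (K message bits, L blocks, N interaction rounds, layer sizes, SNRs) are illustrative assumptions, and the training loop is omitted.

```python
# Minimal, hypothetical sketch of active-feedback interaction between a
# transmitter-side and a receiver-side transformer (not the paper's exact model).
import torch
import torch.nn as nn


def awgn(x: torch.Tensor, snr_db: float) -> torch.Tensor:
    """Add white Gaussian noise at the given SNR (unit-power input assumed)."""
    noise_std = 10 ** (-snr_db / 20)
    return x + noise_std * torch.randn_like(x)


class Agent(nn.Module):
    """Tiny transformer used at either end; all sizes are illustrative."""
    def __init__(self, in_dim: int, d_model: int = 32, out_dim: int = 1):
        super().__init__()
        self.proj = nn.Linear(in_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4,
                                           dim_feedforward=64, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, out_dim)

    def forward(self, feats: torch.Tensor) -> torch.Tensor:  # (batch, blocks, in_dim)
        return self.head(self.encoder(self.proj(feats)))


# Hypothetical setup: K message bits per block, L blocks, N interaction rounds.
K, L, N, B = 4, 16, 6, 8
fwd_snr_db, fb_snr_db = 1.0, 20.0

tx = Agent(in_dim=K + N, out_dim=1)       # transmitter: bits + feedback heard so far
rx = Agent(in_dim=N, out_dim=1)           # receiver: actively crafts a feedback symbol
dec = Agent(in_dim=N, out_dim=2 ** K)     # final decoder over all received symbols

bits = torch.randint(0, 2, (B, L, K)).float()
fwd_hist = torch.zeros(B, L, N)           # symbols received by the receiver
fb_hist = torch.zeros(B, L, N)            # feedback symbols received by the transmitter

with torch.no_grad():                     # forward pass only; training loop omitted
    for t in range(N):
        # Transmitter encodes its bits plus the feedback it has heard so far.
        x = tx(torch.cat([bits, fb_hist], dim=-1)).squeeze(-1)   # (B, L)
        fwd_hist[..., t] = awgn(x, fwd_snr_db)
        # Receiver actively computes a feedback symbol from what it has received.
        f = rx(fwd_hist).squeeze(-1)                             # (B, L)
        fb_hist[..., t] = awgn(f, fb_snr_db)

    logits = dec(fwd_hist)                # (B, L, 2**K): decode each block's bits
    print(logits.argmax(-1).shape)        # torch.Size([8, 16])
```

The key difference from a passive-feedback setup is the `rx` module: instead of the transmitter simply observing a noisy copy of the received signal, the receiver runs its own learned transformer to decide what to send back on the feedback link at each round.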