To improve short-length codes, we demonstrate that classical decoders can also be used with real-valued, neural encoders, i.e., deep-learning-based codeword sequence generators. Here, the classical decoder is a valuable tool to gain insights into these neural codes and shed light on their weaknesses. Specifically, the turbo-autoencoder is a recently developed channel coding scheme in which both encoder and decoder are replaced by neural networks. We first show that the limited receptive field of convolutional neural network (CNN)-based codes enables the application of the BCJR algorithm to optimally decode them with feasible computational complexity. These maximum a posteriori (MAP) component decoders are then used to form classical (iterative) turbo decoders for parallel or serially concatenated CNN encoders, offering close-to-maximum-likelihood (ML) decoding of the learned codes. To the best of our knowledge, this is the first time that a classical decoding algorithm has been applied to a non-trivial, real-valued neural code. Furthermore, as the BCJR algorithm is fully differentiable, it is possible to train, or fine-tune, the neural encoder in an end-to-end fashion.
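To illustrate the key observation that a finite receptive field induces a finite-state trellis on which BCJR is exact, the following is a minimal, hypothetical sketch. It is not the paper's implementation: a single fixed random 1-D kernel of width `K` stands in for the trained CNN encoder, the channel is AWGN, and the trellis has `2**(K-1)` states (one per possible content of the encoder's memory). All names (`encode`, `bcjr_llr`, `kernel`) are illustrative assumptions.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)

K, N, sigma = 3, 20, 0.5      # hypothetical kernel width, block length, noise std
m = K - 1                     # encoder memory -> 2**m trellis states

# Stand-in for a trained real-valued CNN encoder: one fixed random kernel
# mapping each K-bit window (in +/-1 form) to a single real code symbol.
kernel = rng.standard_normal(K)

def encode(bits):
    # Zero-pad on the left so the encoder starts from a known state.
    x = 2.0 * np.concatenate([np.zeros(m), bits]) - 1.0
    return np.array([kernel @ x[i:i + K] for i in range(len(bits))])

def bcjr_llr(y):
    """Exact symbol-wise MAP (BCJR) over the 2**m-state trellis that the
    encoder's limited receptive field induces."""
    states = list(product([0, 1], repeat=m))   # state = last m input bits
    idx = {s: j for j, s in enumerate(states)}
    S = len(states)
    alpha = np.zeros((N + 1, S)); alpha[0, idx[(0,) * m]] = 1.0
    beta = np.zeros((N + 1, S)); beta[N, :] = 1.0

    def gamma(i, s, b):
        # Branch metric: Gaussian likelihood of y[i] given (state, input bit).
        window = 2.0 * np.array(s + (b,)) - 1.0
        return np.exp(-(y[i] - kernel @ window) ** 2 / (2 * sigma ** 2))

    for i in range(N):                         # forward recursion
        for s in states:
            for b in (0, 1):
                alpha[i + 1, idx[s[1:] + (b,)]] += alpha[i, idx[s]] * gamma(i, s, b)
        alpha[i + 1] /= alpha[i + 1].sum()     # normalise for numerical stability
    for i in range(N - 1, -1, -1):             # backward recursion
        for s in states:
            for b in (0, 1):
                beta[i, idx[s]] += gamma(i, s, b) * beta[i + 1, idx[s[1:] + (b,)]]
        beta[i] /= beta[i].sum()

    llr = np.zeros(N)
    for i in range(N):                         # combine into per-bit LLRs
        p = [0.0, 0.0]
        for s in states:
            for b in (0, 1):
                p[b] += alpha[i, idx[s]] * gamma(i, s, b) * beta[i + 1, idx[s[1:] + (b,)]]
        llr[i] = np.log(p[1] / p[0])
    return llr

bits = rng.integers(0, 2, N)
y = encode(bits) + sigma * rng.standard_normal(N)
decoded = (bcjr_llr(y) > 0).astype(int)
```

Because every step of `bcjr_llr` is built from sums, products, and exponentials, the same computation is differentiable, which is what allows the end-to-end fine-tuning of the encoder mentioned above.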