Small neural networks (NNs) used for error correction have been shown to improve on classical channel codes and to cope with changes in the channel model. We extend the code dimension of any such structure by reusing the same NN under one-hot encoding multiple times and serially concatenating it with an outer classical code. All NNs share the same network parameters, and each Reed-Solomon codeword symbol is the input to a different NN. We illustrate significant improvements in block error probability over an additive Gaussian noise channel compared to the small neural code alone, as well as robustness to channel model changes.
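The concatenated structure can be made concrete with a short sketch. The Python/PyTorch snippet below is a minimal illustration, not the paper's implementation: the Reed-Solomon codeword is replaced by a placeholder vector of random symbols, and the layer sizes, symbol size (GF(2^8)), inner block length, and noise level are assumed for illustration only. It shows the key idea that every outer-code symbol is one-hot encoded and passed through the same small inner NN encoder, so the network parameters are shared across all symbols of a codeword.

import torch
import torch.nn as nn
import torch.nn.functional as F

SYMBOL_BITS = 8            # bits per Reed-Solomon symbol (GF(2^8)); assumed value
ONE_HOT_DIM = 2 ** SYMBOL_BITS
INNER_BLOCK_LEN = 16       # channel uses per inner NN codeword; assumed value

class InnerNNEncoder(nn.Module):
    """Small neural encoder mapping a one-hot symbol to a real-valued codeword."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(ONE_HOT_DIM, 128), nn.ReLU(),
            nn.Linear(128, INNER_BLOCK_LEN),
        )

    def forward(self, one_hot):
        x = self.net(one_hot)
        # Normalize each inner codeword to unit average power before the AWGN channel
        return x / x.norm(dim=-1, keepdim=True) * INNER_BLOCK_LEN ** 0.5

def encode_outer_codeword(rs_symbols, encoder):
    """Encode a vector of outer-code symbols with a single shared inner NN.

    rs_symbols: LongTensor of shape (n_symbols,), values in [0, ONE_HOT_DIM).
    Returns channel inputs of shape (n_symbols, INNER_BLOCK_LEN).
    """
    one_hot = F.one_hot(rs_symbols, num_classes=ONE_HOT_DIM).float()
    return encoder(one_hot)   # the same parameters are applied to every symbol

# Usage: encode one (placeholder) outer codeword of 255 symbols, send over AWGN.
encoder = InnerNNEncoder()
rs_codeword = torch.randint(0, ONE_HOT_DIM, (255,))  # stands in for a real RS codeword
x = encode_outer_codeword(rs_codeword, encoder)
y = x + 0.5 * torch.randn_like(x)                    # AWGN channel; noise level assumed

A matching inner NN decoder would map each received block of INNER_BLOCK_LEN channel outputs to a distribution over the 2^SYMBOL_BITS symbol values, and the resulting symbol decisions would then be handed to the outer Reed-Solomon decoder.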