In this work, we propose a fully differentiable graph neural network (GNN)-based architecture for channel decoding and showcase competitive decoding performance for various coding schemes, such as low-density parity-check (LDPC) and BCH codes. The idea is to let a neural network (NN) learn a generalized message passing algorithm over a given graph that represents the forward error correction (FEC) code structure, replacing node and edge message updates with trainable functions. In contrast to many other deep learning-based decoding approaches, the proposed solution scales to arbitrary block lengths, and its training is not limited by the curse of dimensionality. We benchmark our proposed decoder against the state of the art in conventional channel decoding as well as against recent deep learning-based results. For the (63,45) BCH code, our solution outperforms weighted belief propagation (BP) decoding by approximately 0.4 dB with significantly fewer decoding iterations, and even for 5G NR LDPC codes we observe competitive performance compared to conventional BP decoding. For the BCH codes, the resulting GNN decoder can be fully parametrized with only 9640 weights.
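To make the core idea concrete, the sketch below shows one trainable message-passing iteration over a Tanner graph in a PyTorch-style formulation. This is a minimal illustration, not the authors' implementation: the module names, the message dimension `msg_dim`, and the simple sum aggregation are assumptions made for the example.

```python
# Minimal sketch (illustrative, not the paper's exact architecture) of one
# trainable message-passing iteration over a Tanner graph: the conventional
# check-node and variable-node update rules of BP are replaced by small MLPs.
import torch
import torch.nn as nn


class GNNDecoderIteration(nn.Module):
    def __init__(self, msg_dim: int = 16, hidden_dim: int = 32):
        super().__init__()
        # Trainable replacement for the check-node (edge message) update.
        self.check_update = nn.Sequential(
            nn.Linear(2 * msg_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, msg_dim))
        # Trainable replacement for the variable-node update.
        self.var_update = nn.Sequential(
            nn.Linear(msg_dim + 1, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, msg_dim))

    def forward(self, llr, var_state, messages, edges_vn, edges_cn):
        # llr:       (n,)          channel LLR per variable node
        # var_state: (n, msg_dim)  variable-node embeddings
        # messages:  (E, msg_dim)  one message per Tanner-graph edge
        # edges_vn, edges_cn: (E,) variable / check node index of each edge
        num_cn = int(edges_cn.max()) + 1

        # Check-node side: aggregate incoming messages per check node
        # (sum aggregation used here for simplicity) and update each
        # edge message with a small MLP.
        cn_agg = messages.new_zeros(num_cn, messages.shape[1])
        cn_agg.index_add_(0, edges_cn, messages)
        messages = self.check_update(
            torch.cat([messages, cn_agg[edges_cn]], dim=-1))

        # Variable-node side: aggregate updated messages per variable node
        # and fuse them with the channel LLR to refine the node embedding.
        vn_agg = var_state.new_zeros(var_state.shape)
        vn_agg.index_add_(0, edges_vn, messages)
        var_state = self.var_update(
            torch.cat([vn_agg, llr.unsqueeze(-1)], dim=-1))
        return var_state, messages
```

In such a design, a fixed number of these iterations would be stacked (possibly with shared weights), followed by a readout layer mapping the final variable-node embeddings to per-bit LLR estimates, and the unrolled decoder would be trained end-to-end on a bit-wise loss against the transmitted codeword.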