While polar-code construction for successive-cancellation (SC) decoding can be implemented efficiently by sorting the bit-channels, finding optimal polar-code constructions for successive-cancellation list (SCL) decoding in an efficient and scalable manner remains an open problem. This paper proposes a graph neural network (GNN)-based reinforcement learning algorithm, named the iterative message-passing (IMP) algorithm, to solve the polar-code construction problem for SCL decoding. The algorithm operates only on the local structure of the graph induced by the polar code's generator matrix. The size of the IMP model is independent of the blocklength and code rate, making it scalable to construct polar codes with long blocklengths. Moreover, a single trained IMP model can be directly applied to a wide range of target blocklengths, code rates, and channel conditions, generating the corresponding polar codes without separate training. Numerical experiments show that the IMP algorithm finds polar-code constructions that significantly outperform the classical constructions under cyclic-redundancy-check-aided SCL (CA-SCL) decoding. Compared with other learning-based construction methods tailored to SCL/CA-SCL decoding, the IMP algorithm constructs polar codes with comparable or lower frame error rates while significantly reducing training complexity, since it eliminates the need for separate training at each target blocklength, code rate, and channel condition.