There have been significant research activities in recent years to automate the design of channel encoders and decoders via deep learning. Due to the dimensionality challenge in channel coding, it is prohibitively complex to design and train relatively large neural channel codes via deep learning techniques. Consequently, most of the results in the literature are limited to relatively short codes with fewer than 100 information bits. In this paper, we construct ProductAEs, a computationally efficient family of deep-learning-driven (encoder, decoder) pairs, that aim at enabling the training of relatively large channel codes (both encoders and decoders) with a manageable training complexity. We build upon the ideas from classical product codes, and propose constructing large neural codes using smaller code components. More specifically, instead of directly training the encoder and decoder for a large neural code of dimension $k$ and blocklength $n$, we provide a framework that requires training neural encoders and decoders for the code parameters $(n_1,k_1)$ and $(n_2,k_2)$ such that $n_1 n_2=n$ and $k_1 k_2=k$. Our training results show significant gains, over all ranges of signal-to-noise ratio (SNR), for a code of parameters $(225,100)$ and a moderate-length code of parameters $(441,196)$, over polar codes under successive cancellation (SC) decoding. Moreover, our results demonstrate meaningful gains over Turbo Autoencoder (TurboAE) and state-of-the-art classical codes. This is the first work to design product autoencoders and a pioneering work on training large channel codes.
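The product construction underlying ProductAEs can be illustrated with classical component codes: a $k_1 \times k_2$ message matrix is encoded row-wise with one component code and column-wise with the other, yielding an $(n_1 n_2, k_1 k_2)$ code. The sketch below uses hypothetical $(3,2)$ single-parity-check components as stand-ins; in the paper the component encoders and decoders are learned neural networks, not the hand-crafted codes shown here.

```python
import numpy as np

def spc_encode(bits):
    # (k+1, k) single-parity-check encoder: append one even-parity bit.
    return np.concatenate([bits, [bits.sum() % 2]])

def product_encode(msg, enc_row, enc_col):
    # msg: (k1, k2) binary message matrix.
    # Step 1: encode each length-k2 row with the row component code -> (k1, n2).
    inner = np.array([enc_row(r) for r in msg])
    # Step 2: encode each length-k1 column with the column code -> (n1, n2).
    return np.array([enc_col(c) for c in inner.T]).T

msg = np.array([[1, 0], [1, 1]])                       # k1 = k2 = 2, so k = 4
codeword = product_encode(msg, spc_encode, spc_encode)
print(codeword.shape)                                  # (3, 3): n1 = n2 = 3, so n = 9
```

Because the two encodings commute for linear codes, every row and every column of the final matrix is a valid component codeword, which is what lets each small decoder be trained and applied independently.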