Broadcast/multicast communication systems are typically designed to optimize the outage rate criterion, which neglects the performance of the fraction of clients with the worst channel conditions. Targeting ultra-reliable communication scenarios, this paper takes a complementary approach by introducing the conditional value-at-risk (CVaR) rate as the expected rate of a worst-case fraction of clients. To support differential quality-of-service (QoS) levels in this class of clients, layered division multiplexing (LDM) is applied, which enables decoding at different rates. Focusing on a practical scenario in which the transmitter does not know the fading distribution, layer allocation is optimized based on a dataset sampled during deployment. The optimality gap caused by the limited availability of data is bounded via a generalization analysis, and the sample complexity is shown to increase as the designated fraction of worst-case clients decreases. Considering this theoretical result, meta-learning is introduced as a means to reduce sample complexity by leveraging data from previous deployments. Numerical experiments demonstrate that LDM improves spectral efficiency even for small datasets; that, for sufficiently large datasets, the proposed mirror-descent-based layer optimization scheme achieves a CVaR rate close to that achieved when the transmitter knows the fading distribution; and that meta-learning can significantly reduce data requirements.
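To make the criterion concrete, the CVaR rate can be written in a standard form (a sketch of one common definition; the paper's exact formulation and notation may differ). Letting $R$ denote the rate delivered to a randomly drawn client under the fading distribution and $\alpha \in (0,1]$ the designated fraction of worst-case clients, the lower-tail CVaR rate is

$$\mathrm{CVaR}_\alpha(R) \;=\; \mathbb{E}\!\left[R \,\middle|\, R \le F_R^{-1}(\alpha)\right] \;=\; \max_{c \in \mathbb{R}} \left\{ c - \frac{1}{\alpha}\,\mathbb{E}\!\left[(c - R)^{+}\right] \right\},$$

where $F_R^{-1}(\alpha)$ is the $\alpha$-quantile of $R$ (the conventional outage rate) and the conditional-expectation form holds when the rate distribution is continuous. As $\alpha \to 0$ the criterion approaches the worst-case rate, while $\alpha = 1$ recovers the average rate.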
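As a rough illustration of how a mirror-descent-based layer optimization could operate on sampled channel data, the sketch below runs exponentiated-gradient ascent (mirror descent under a negative-entropy mirror map) on the empirical CVaR rate over the simplex of LDM layer power fractions. The two-layer rate model, the decoding thresholds `T`, the finite-difference gradients, and the fading samples are assumptions made only for this example; the paper's actual rate model and update rule may differ.

```python
# A minimal sketch, not the paper's exact scheme: exponentiated-gradient
# (mirror-descent) ascent of an empirical CVaR rate over the simplex of LDM
# layer power fractions. The two-layer rate model, the decoding thresholds T,
# the finite-difference gradients and the fading samples are all assumptions
# made only for this illustration.
import numpy as np

T = np.array([0.2, 5.0])  # assumed channel-gain thresholds for decoding layers 1 and 2

def per_client_rate(p, g):
    """Rate of clients with channel gains g under power fractions p: a client
    accumulates the rate of every layer whose threshold its gain meets, with
    not-yet-decoded layers treated as interference."""
    rates = np.zeros_like(g)
    for k, t in enumerate(T):
        interference = t * p[k + 1:].sum()
        layer_rate = np.log2(1.0 + p[k] * t / (1.0 + interference))
        rates += layer_rate * (g >= t)
    return rates

def empirical_cvar(r, alpha):
    """Average rate of the worst ceil(alpha * n) clients in the sample."""
    k = max(1, int(np.ceil(alpha * r.size)))
    return np.sort(r)[:k].mean()

def mirror_descent(gains, alpha=0.1, iters=300, lr=1.0, eps=1e-5):
    p = np.full(len(T), 1.0 / len(T))  # start at the centre of the simplex
    for _ in range(iters):
        f0 = empirical_cvar(per_client_rate(p, gains), alpha)
        grad = np.zeros_like(p)
        for j in range(p.size):  # finite-difference derivative along the simplex
            q = p.copy(); q[j] += eps; q /= q.sum()
            grad[j] = (empirical_cvar(per_client_rate(q, gains), alpha) - f0) / eps
        p = p * np.exp(lr * grad)  # multiplicative (entropy-mirror) update
        p /= p.sum()               # renormalise onto the simplex
    return p

rng = np.random.default_rng(0)
gains = rng.exponential(scale=5.0, size=5000)  # assumed Rayleigh-fading power gains
print("layer power fractions:", mirror_descent(gains))
```

The multiplicative update is what distinguishes mirror descent with a negative-entropy mirror map from ordinary projected gradient ascent: iterates stay strictly positive, and a simple renormalization replaces an explicit Euclidean projection onto the simplex.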