Millimeter-wave (mmWave) communication systems rely on narrow beams to achieve sufficient receive signal power. Adjusting these beams is typically associated with large training overhead, which becomes particularly critical for highly mobile applications. Intuitively, since optimal beam selection can benefit from knowledge of the positions of the communication terminals, there has been increasing interest in leveraging position data to reduce the overhead of mmWave beam prediction. Prior work, however, studied this problem using only synthetic data, which generally does not accurately represent real-world measurements. In this paper, we investigate position-aided beam prediction using a real-world large-scale dataset to derive insights into precisely how much overhead can be saved in practice. Furthermore, we analyze which machine learning algorithms perform best, what factors degrade inference performance on real data, and which machine learning metrics are most meaningful in capturing the actual communication system performance.
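To make the problem setup concrete, the following is a minimal sketch (not the approach evaluated in this paper) of position-aided beam prediction framed as classification: a model maps receiver position to the index of the best beam in a fixed codebook, and top-k accuracy reflects the reduced beam-training overhead of sweeping only k candidate beams instead of the full codebook. The dataset here is synthetic stand-in data, and the codebook size, classifier, and value of k are illustrative assumptions.

```python
# Hypothetical sketch: predict the best beam index from (x, y) position
# with a simple K-nearest-neighbors baseline. All data and parameters are
# illustrative assumptions, not values from the paper.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

N_BEAMS = 64  # assumed codebook size
rng = np.random.default_rng(0)

# Synthetic stand-in data: positions around a base station at the origin;
# the best beam index roughly follows the azimuth angle to the receiver.
pos = rng.uniform(-50, 50, size=(5000, 2))
angles = np.arctan2(pos[:, 1], pos[:, 0])
beam = ((angles + np.pi) / (2 * np.pi) * N_BEAMS).astype(int) % N_BEAMS

X_train, X_test, y_train, y_test = train_test_split(
    pos, beam, test_size=0.2, random_state=0
)

clf = KNeighborsClassifier(n_neighbors=5)
clf.fit(X_train, y_train)

# Top-1 accuracy: the single predicted beam matches the true best beam.
# Top-k accuracy: the true beam is among the k most probable predictions,
# i.e., a k-beam sweep would still find it.
proba = clf.predict_proba(X_test)
classes = clf.classes_
top1 = (classes[np.argmax(proba, axis=1)] == y_test).mean()
k = 3
topk_idx = np.argsort(proba, axis=1)[:, -k:]
topk = np.mean([y in classes[idx] for y, idx in zip(y_test, topk_idx)])
print(f"Top-1 accuracy: {top1:.3f}, Top-{k} accuracy: {topk:.3f}")
```

In this framing, the overhead saving comes from replacing an exhaustive sweep over all N_BEAMS codebook entries with a sweep over only the k predicted candidates; the gap between top-1 and top-k accuracy, and how either translates into achievable receive power, is exactly the kind of metric question the paper examines on real measurements.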