As 5G continues to develop, the era of the Internet of Everything is approaching. Against this backdrop, massive device connectivity, massive service requests, extremely high network load, and complex, dynamic network environments pose enormous challenges for 5G system optimization. In the face of these technical difficulties, artificial intelligence (AI) algorithms have demonstrated unique advantages. This article first introduces the advantages of deep-learning-based AI algorithms over traditional algorithms in 5G systems; it then elaborates on AI algorithm applications in multi-access edge computing and millimeter-wave massive multiple-input multiple-output (mmWave Massive MIMO) systems, comparing the strengths and weaknesses of the various methods; finally, drawing on existing research, it summarizes the shortcomings of AI algorithms in practical 5G scenarios and offers an outlook on future research directions.
Data augmentation is becoming essential for improving regression accuracy in critical applications including manufacturing and finance. Existing techniques for data augmentation largely focus on classification tasks and do not readily apply to regression tasks. In particular, the recent Mixup techniques for classification rely on the key assumption that linearity holds among training examples, which is reasonable when the label space is discrete, but has limitations when the label space is continuous, as in regression. We show that mixing examples that are far apart in either data or label space can have an increasingly negative effect on model performance. Hence, we adopt the stricter assumption that, for regression, linearity only holds within certain data or label distances, where the admissible distance may vary by example. We then propose MixRL, a data augmentation meta-learning framework for regression that learns, for each example, how many nearest neighbors it should be mixed with to achieve the best model performance on a small validation set. MixRL achieves these objectives using Monte Carlo policy gradient reinforcement learning. Our experiments on both synthetic and real datasets show that MixRL significantly outperforms state-of-the-art data augmentation baselines. MixRL can also be integrated with other classification Mixup techniques for better results.
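The distance-limited mixing idea behind this approach can be sketched as follows. The code mixes each example only with one of its k nearest neighbors in feature space, so that the local-linearity assumption is more plausible than in vanilla Mixup. This is an illustrative simplification: the function name `knn_mixup_regression`, the fixed global `k`, and the Beta mixing distribution parameter `alpha` are assumptions for this sketch, whereas MixRL itself learns a per-example neighbor count with policy-gradient reinforcement learning.

```python
import numpy as np

def knn_mixup_regression(X, y, k=3, alpha=2.0, rng=None):
    """Distance-limited Mixup for regression (illustrative sketch).

    Each example is mixed only with one of its k nearest neighbors in
    feature space. In MixRL the per-example k is learned via Monte Carlo
    policy gradient RL; here it is fixed for simplicity.
    """
    rng = np.random.default_rng(rng)
    n = len(X)
    # Pairwise Euclidean distances in feature space.
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)            # exclude mixing an example with itself
    nn = np.argsort(d, axis=1)[:, :k]      # indices of the k nearest neighbors
    # Pick one neighbor uniformly at random for each example.
    j = nn[np.arange(n), rng.integers(0, k, size=n)]
    lam = rng.beta(alpha, alpha, size=n)   # Mixup mixing coefficients
    X_mix = lam[:, None] * X + (1 - lam[:, None]) * X[j]
    y_mix = lam * y + (1 - lam) * y[j]     # labels are mixed linearly as well
    return X_mix, y_mix
```

Because each synthetic label is a convex combination of two nearby original labels, the augmented targets stay within the local label range, which is exactly the regime where the linearity assumption is claimed to hold.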