In this paper, we consider the problem of joint beam selection and link activation across a set of communication pairs in ultra-dense device-to-device (D2D) mmWave communication networks, where interference between communication pairs is controlled by deactivating a subset of the pairs. The resulting optimization problem is formulated as an integer programming problem that is nonconvex and NP-hard; consequently, a globally optimal solution, or even a locally optimal one, generally cannot be obtained. To overcome this challenge, we design a deep learning architecture based on graph neural networks (GNNs), termed GBLinks, that performs joint beam selection and link activation while taking the network topology into account. We further present an unsupervised Lagrangian dual learning framework to train the parameters of the GBLinks model. Numerical results show that the proposed GBLinks model converges to a stable point, in terms of the weighted sum rate, as the number of iterations increases. Furthermore, GBLinks achieves near-optimal solutions compared with the exhaustive search scheme in small-scale ultra-dense D2D mmWave communication networks and outperforms the GreedyNoSched and SCA-based methods. The results also show that GBLinks generalizes to varying densities and coverage regions of ultra-dense D2D mmWave communication networks.
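To illustrate the unsupervised Lagrangian dual learning idea mentioned above, the following is a minimal, self-contained sketch on a toy scalar problem (the problem, variable names, and step sizes are illustrative assumptions, not the paper's actual GBLinks formulation): the primal variable is updated by gradient descent on the Lagrangian, and the dual multiplier by projected gradient ascent on the constraint violation.

```python
# Hypothetical toy illustration of a Lagrangian dual learning loop
# (illustrative only; not the actual GBLinks model or its constraints):
# maximize log(1 + x) subject to x <= P, by alternating
#   primal step: gradient descent on L(x, lam) = -log(1 + x) + lam * (x - P)
#   dual step:   projected gradient ascent on lam (kept nonnegative).

P = 2.0                      # toy resource budget (assumed for illustration)
x, lam = 0.1, 0.0            # primal variable and dual multiplier
eta_x, eta_lam = 0.05, 0.05  # step sizes

for _ in range(5000):
    grad_x = -1.0 / (1.0 + x) + lam          # dL/dx
    x = max(0.0, x - eta_x * grad_x)         # primal descent, keep x >= 0
    lam = max(0.0, lam + eta_lam * (x - P))  # dual ascent, project to lam >= 0

# At equilibrium the constraint is tight, so x settles near the budget P.
print(round(x, 2))
```

In the full framework, the scalar primal update would be replaced by a gradient step on the GNN parameters, while the dual multipliers for the network constraints are updated in the same projected-ascent fashion.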