This paper considers computation offloading in fog radio access networks (F-RAN), where multiple user equipments (UEs) offload their computation tasks to the F-RAN through a number of fog nodes. Each UE can choose one of the fog nodes to offload its task, and each fog node may serve multiple UEs. Depending on the computation burden at the fog nodes, the tasks may be computed by the fog nodes or further offloaded to the cloud via capacity-limited fronthaul links. To compute all UEs' tasks as fast as possible, a joint optimization of the UE-Fog association and the radio and computation resources of the F-RAN is proposed to minimize the maximum latency over all UEs. This min-max problem is formulated as a mixed integer nonlinear program (MINP). We first show that the MINP can be reformulated as a continuous optimization problem, and then employ the majorization-minimization (MM) approach to find a solution. The MM approach that we develop is unconventional in that each MM subproblem can be solved inexactly with the same provable convergence guarantee as the conventional exact MM, thereby reducing the complexity of each MM iteration. In addition, we also consider a cooperative offloading model, where the fog nodes compress-and-forward their received signals to the cloud. Under this model, a similar min-max latency optimization problem is formulated and tackled again by the inexact MM approach. Simulation results show that the proposed algorithms outperform several heuristic offloading strategies, and that cooperative offloading better exploits transmission diversity to attain lower latency than the non-cooperative scheme.
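The MM approach referred to above iteratively replaces a hard objective with an easier upper-bounding surrogate that is tight at the current iterate, then minimizes the surrogate. As a generic illustration only (a toy problem, not the paper's F-RAN latency algorithm), the sketch below applies MM to least-absolute-deviation location estimation, where each surrogate subproblem is a weighted least-squares problem with a closed-form minimizer; the paper's contribution is that such subproblems may even be solved inexactly without losing the convergence guarantee.

```python
def mm_median(x, iters=200, eps=1e-9):
    """Toy MM iteration: minimize f(m) = sum_i |x_i - m|.

    At iterate m_k, each |x_i - m| is majorized by the quadratic
    (x_i - m)^2 / (2|x_i - m_k|) + |x_i - m_k| / 2, which upper-bounds
    |x_i - m| and touches it at m = m_k. Minimizing the surrogate is a
    weighted least-squares problem solved in closed form.
    """
    m = sum(x) / len(x)  # initialize at the mean
    for _ in range(iters):
        # surrogate weights (eps guards division by zero at data points)
        w = [1.0 / max(abs(xi - m), eps) for xi in x]
        # exact minimizer of the weighted least-squares surrogate
        m = sum(wi * xi for wi, xi in zip(w, x)) / sum(w)
    return m

print(mm_median([1.0, 2.0, 3.0, 4.0, 100.0]))  # converges to the median, 3.0
```

By the majorization property, each iteration monotonically decreases the original objective, which is the mechanism behind the convergence guarantees discussed in the abstract.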