We study energy-efficient offloading strategies in a large-scale mobile edge computing (MEC) system with heterogeneous mobile users and network components. The system supports user-task handovers that capture the mobility of the users. We focus on a long-run objective and on online algorithms applicable to realistic systems. The problem is significantly complicated by the large problem size, the heterogeneity of user tasks and network components, and the mobility of the users; conventional optimizers cannot reach the optimum with a reasonable amount of computational and storage power. We formulate the problem as a restless multi-armed bandit process, which enables the decomposition of the high-dimensional state space and yields near-optimal algorithms that operate online on realistically large problems. Following the restless bandit technique, we propose two offloading policies that prioritize the computing and communication resources in the edge and cloud networks with the least marginal costs; this coincides with selecting the resources with the highest energy efficiency. Both policies scale to large instances of the offloading problem and have strong potential to achieve provable asymptotic optimality, that is, to approach optimality as the problem size tends to infinity. Extensive numerical simulations demonstrate that the proposed policies clearly outperform baseline policies in power conservation and remain robust under the tested heavy-tailed lifespan distributions of the offloaded tasks.
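To make the priority rule concrete, the following is a minimal Python sketch of the idea rather than the paper's actual policies: each candidate resource is ranked by a hypothetical marginal energy cost per completed task, and an arriving task is routed to the cheapest (i.e., most energy-efficient) resource with spare capacity. All names, fields, and the cost model here are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Resource:
    """A candidate computing/communication resource (hypothetical model)."""
    name: str
    marginal_power: float  # extra power (W) drawn when serving one more task
    service_rate: float    # tasks completed per second
    capacity: int          # maximum concurrent tasks
    load: int = 0          # tasks currently assigned

    def marginal_cost(self) -> float:
        # Marginal energy per completed task (J/task); a lower value
        # means higher energy efficiency.
        return self.marginal_power / self.service_rate

def offload(resources: list[Resource]) -> Resource | None:
    """Route an arriving task to the least-marginal-cost resource with spare capacity."""
    available = [r for r in resources if r.load < r.capacity]
    if not available:
        return None  # no spare capacity: reject or queue the task
    best = min(available, key=lambda r: r.marginal_cost())
    best.load += 1
    return best

# Toy usage: three heterogeneous resources, five arriving tasks.
resources = [
    Resource("edge-1", marginal_power=4.0, service_rate=8.0, capacity=2),
    Resource("edge-2", marginal_power=6.0, service_rate=10.0, capacity=2),
    Resource("cloud", marginal_power=35.0, service_rate=50.0, capacity=100),
]
for task_id in range(5):
    chosen = offload(resources)
    print(f"task {task_id} -> {chosen.name if chosen else 'rejected'}")
```

In this toy run the two edge resources (marginal costs 0.5 and 0.6 J/task) fill up before the cloud (0.7 J/task) is used, mirroring the least-marginal-cost ordering described above; the actual policies additionally account for handovers and user mobility, which this sketch omits.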