Task offloading is a widely used technology in Mobile Edge Computing (MEC) that reduces the completion time of user tasks with the help of resource-rich edge servers. Existing works mainly focus on the case where the computation density of a user task is homogeneous, so that it can be offloaded in full or by percentage. However, many real-world user tasks consist of several interdependent subtasks, each of which is a logically indivisible execution unit. Motivated by this gap, this paper addresses the Dependent Task Offloading (DTO) problem in a multi-user, multi-edge scenario. We first represent a dependent task as a Directed Acyclic Graph (DAG), where nodes denote subtasks and directed edges denote dependencies among subtasks. We then propose a scheme based on Graph Attention Networks (GAT) and Deep Reinforcement Learning (DRL) to minimize the makespan of user tasks. To use the GAT efficiently, we train it on the resource-rich cloud in an unsupervised manner, given its large data and computation requirements. In addition, we design a multi-discrete action space for the DRL algorithm to enhance the applicability of the proposed scheme. Experiments are conducted on broadly distributed synthetic data. The results demonstrate that the proposed approach adapts to both simple and complex MEC environments and outperforms other methods.
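To make the task model concrete, the sketch below illustrates (in Python, not the authors' code) how a dependent task can be encoded as a DAG with subtasks as nodes and dependencies as directed edges, and what a multi-discrete offloading action might look like; the subtask sizes, number of edge servers, and variable names are assumptions for illustration only.

```python
# Minimal sketch: a dependent task as a DAG plus a multi-discrete offloading action.
# All concrete values (CPU cycles, number of servers) are assumed for illustration.
import networkx as nx

# Each node is an indivisible subtask; each directed edge is a dependency.
task = nx.DiGraph()
task.add_nodes_from([
    (0, {"cycles": 2e8}),   # entry subtask, required CPU cycles (assumed)
    (1, {"cycles": 5e8}),
    (2, {"cycles": 3e8}),
    (3, {"cycles": 1e8}),   # exit subtask
])
task.add_edges_from([(0, 1), (0, 2), (1, 3), (2, 3)])  # subtasks 1 and 2 may run in parallel

assert nx.is_directed_acyclic_graph(task)

# A multi-discrete action assigns every subtask an execution location:
# 0 = local device, 1..M = edge servers (M = 2 assumed here).
action = [0, 1, 2, 1]   # one discrete choice per subtask

# A subtask can only start after all its predecessors finish, so any
# feasible schedule must respect a topological order of the DAG.
print(list(nx.topological_sort(task)))  # e.g. [0, 1, 2, 3]
```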