In Edge Computing (EC), containers are increasingly used to deploy applications that provide services to mobile users. Each container can only run if its container image file is available locally. However, existing work has largely neglected the fact that effective task scheduling combined with dynamic container image caching is a promising way to reduce container image download time under the limited bandwidth resources of edge nodes. To fill this gap, in this paper we propose novel joint Task Scheduling and Image Caching (TSIC) algorithms. Specifically: 1) we formulate the joint task scheduling and image caching problem as a Markov Decision Process (MDP), taking communication delay, waiting delay, and computation delay into consideration; 2) to solve the MDP, we propose a TSIC algorithm based on deep reinforcement learning with customized state and action spaces, combined with an adaptive caching update algorithm; and 3) we implement a real container system to validate our algorithms. Experiments show that our strategy outperforms existing baseline approaches by 23\% and 35\% on average in terms of total delay and waiting delay, respectively.
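As a rough illustration of the kind of state representation and delay model such an MDP could build on, the minimal Python sketch below encodes an arriving task together with the cache contents and load of each edge node, and estimates the total delay (image download, waiting, and computation) for a candidate scheduling decision. All field names and the delay formula here are assumptions for illustration only; the paper's actual customized state and action spaces may differ.

```python
from dataclasses import dataclass, field
from typing import List, Set

@dataclass
class EdgeNode:
    """Hypothetical edge-node descriptor used by this sketch."""
    cached_images: Set[int]   # ids of container images cached locally
    queue_delay: float        # current waiting delay in the task queue (s)
    bandwidth: float          # download bandwidth for missing images (MB/s)
    cpu_rate: float           # processing rate (million instructions / s)

@dataclass
class TaskState:
    """One MDP state: the arriving task plus a snapshot of all edge nodes."""
    image_id: int             # container image required by the task
    image_size: float         # image size (MB)
    workload: float           # task workload (million instructions)
    nodes: List[EdgeNode] = field(default_factory=list)

def estimate_total_delay(state: TaskState, node_idx: int) -> float:
    """Total delay if the task is scheduled on node `node_idx`:
    image download (zero on a cache hit) + queue waiting + computation."""
    node = state.nodes[node_idx]
    download = 0.0 if state.image_id in node.cached_images \
        else state.image_size / node.bandwidth
    return download + node.queue_delay + state.workload / node.cpu_rate
```

In this sketch, an action would pair the chosen node index with a cache-update decision for the required image; a deep reinforcement learning agent would learn this mapping from states to actions rather than evaluating the delay formula greedily.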