To support future 6G mobile applications, mobile edge computing (MEC) networks need to jointly optimize computing, pushing, and caching to reduce transmission load and computation cost. To this end, we propose a deep reinforcement learning framework that dynamically orchestrates these three activities for the MEC network. The framework implicitly predicts users' future requests with deep networks and pushes or caches the appropriate content to enhance performance. To address the curse of dimensionality that arises from considering the three activities jointly, we adopt soft actor-critic reinforcement learning in a continuous action space and design an action quantization and correction scheme tailored to the underlying discrete optimization problem. Simulations in a single-user, single-server MEC setting demonstrate that the proposed framework effectively reduces both transmission load and computation cost under various cache sizes and tolerable service delays.
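The abstract does not spell out the quantization and correction design, so the following is only a minimal sketch of the general idea: a continuous policy output is thresholded into binary push/cache decisions and then corrected to respect a hard cache-capacity constraint. The function name quantize_and_correct, the binary decision encoding, and the catalogue and cache sizes are all illustrative assumptions, not the paper's actual scheme.

```python
import numpy as np

def quantize_and_correct(continuous_action: np.ndarray, cache_size: int) -> np.ndarray:
    """Map a continuous actor output in [-1, 1]^N to binary cache/push
    decisions, then correct the result so that at most `cache_size`
    contents are selected (a hypothetical capacity constraint)."""
    # Quantization: threshold each dimension at 0 to obtain a 0/1 decision.
    discrete = (continuous_action > 0).astype(int)

    # Correction: if the quantized action violates the capacity constraint,
    # keep only the cache_size items with the largest continuous scores.
    if discrete.sum() > cache_size:
        keep = np.argsort(continuous_action)[-cache_size:]
        corrected = np.zeros_like(discrete)
        corrected[keep] = 1
        return corrected
    return discrete

# Usage with a hypothetical 8-content catalogue and a 3-item cache.
rng = np.random.default_rng(0)
raw_action = rng.uniform(-1.0, 1.0, size=8)  # stand-in for a SAC policy output
decision = quantize_and_correct(raw_action, cache_size=3)
print(decision)  # binary vector with at most three ones
```

Correcting in continuous space rather than rejecting infeasible actions keeps the policy gradient informative, which is one plausible motivation for the quantization-and-correction design the abstract names.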