Edge caching plays an increasingly important role in boosting user content retrieval performance while reducing redundant network traffic. The effectiveness of caching ultimately hinges on the accuracy of predicting content popularity in the near future. However, at the network edge, content popularity can be extremely dynamic due to diverse user content retrieval behaviors and the low degree of user multiplexing. It is challenging for traditional reactive caching systems to keep up with such dynamic content popularity patterns. In this paper, we propose a novel Predictive Edge Caching (PEC) system that predicts future content popularity using fine-grained learning models that mine sequential patterns in user content retrieval behaviors, and opportunistically prefetches content predicted to be popular in the near future using idle network bandwidth. Through extensive experiments driven by real content retrieval traces, we demonstrate that PEC can adapt to highly dynamic content popularity, significantly improve the cache hit ratio, and reduce user content retrieval latency compared with state-of-the-art caching policies. More broadly, our study demonstrates that edge caching performance can be boosted by deep mining of user content retrieval behaviors.
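To make the predict-and-prefetch idea above concrete, the following is a minimal sketch of such a loop: a per-user sequential model predicts likely next requests, and spare cache capacity is filled opportunistically. All names here (SequencePredictor, EdgeCache, prefetch, fetch_fn) are hypothetical illustrations and do not reflect PEC's actual models or interfaces, which are described in the body of the paper.

```python
# Illustrative sketch only: a first-order Markov model over each user's
# retrieval sequence, plus opportunistic prefetching into spare cache space.
from collections import defaultdict, Counter

class SequencePredictor:
    """Predicts near-future popular items from per-user retrieval sequences."""
    def __init__(self):
        self.transitions = defaultdict(Counter)  # previous item -> next-item counts
        self.last_item = {}                      # user -> most recently retrieved item

    def observe(self, user, item):
        prev = self.last_item.get(user)
        if prev is not None:
            self.transitions[prev][item] += 1
        self.last_item[user] = item

    def predict_next(self, top_k=5):
        # Aggregate each user's next-item probabilities into a popularity score.
        scores = Counter()
        for user, prev in self.last_item.items():
            total = sum(self.transitions[prev].values())
            for item, count in self.transitions[prev].items():
                scores[item] += count / total
        return [item for item, _ in scores.most_common(top_k)]

class EdgeCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = {}

    def prefetch(self, items, fetch_fn):
        # Fill spare capacity with predicted-popular items, e.g. when the
        # access link is idle.
        for item in items:
            if len(self.store) >= self.capacity:
                break
            if item not in self.store:
                self.store[item] = fetch_fn(item)

# Usage: replay a toy trace, then prefetch during an idle period.
predictor = SequencePredictor()
cache = EdgeCache(capacity=3)
trace = [("u1", "a"), ("u1", "b"), ("u2", "a"), ("u2", "b"), ("u1", "c")]
for user, item in trace:
    predictor.observe(user, item)
cache.prefetch(predictor.predict_next(), fetch_fn=lambda item: f"bytes:{item}")
print(cache.store)
```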