Context data is in greater demand than ever with the rapid growth in the development of context-aware Internet of Things applications. Research in context and context-awareness is being conducted to broaden its applicability in light of many practical and technical challenges. One of these challenges is improving performance when responding to a large number of context queries. Context Management Platforms that infer and deliver context to applications measure this problem using Quality of Service (QoS) parameters. Although caching is a proven way to improve QoS, the transient nature of context, together with features such as the variability and heterogeneity of context queries, poses an additional real-time cost management problem. This paper presents a critical survey of the state of the art in adaptive data caching, with the objective of developing a body of knowledge on cost- and performance-efficient adaptive caching strategies. We comprehensively survey a large number of research publications and evaluate, compare, and contrast the different techniques, policies, approaches, and schemes in adaptive caching. Our critical analysis is motivated by the focus on adaptively caching context as a core research problem. A formal definition for adaptive context caching is then proposed, followed by the identified features and requirements of a well-designed, objectively optimal adaptive context caching strategy.