We consider the centralized coded caching system where a library of files is available at the server and their subfiles are cached at the clients as prescribed by a placement delivery array (PDA). We are interested in the problem where a specific file in the library is replaced with a new file at the server, the contents of which are correlated with the file being replaced, and this change needs to be communicated to the caches. Upon replacement, the server has access only to the updated file and is unaware of how it differs from the original, while each cache has access to specific subfiles of the original file as dictated by the PDA. We model the correlation between the two files by assuming that they differ in at most $\epsilon$ subfiles, and aim to reduce the number of bits broadcast by the server to update the caches. We design an elegant new coded transmission strategy for the server to update the caches blindly, and also identify a simple scheme based on MDS codes. We then derive converse bounds on the minimum communication cost $\ell^*$ among all linear strategies. For two well-known families of PDAs -- Maddah-Ali & Niesen's caching scheme and a PDA by Tang & Ramamoorthy and Yan et al. -- our new scheme has cost $\ell^*(1 + o(1))$ when the updates are sufficiently sparse, while the scheme using MDS codes has order-optimal cost when the updates are dense.
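As a toy illustration of the correlation model only (not of the paper's coded transmission or MDS-based schemes), the sketch below splits two files into equal-length subfiles and counts how many subfiles differ; the sparsity assumption above says this count is at most $\epsilon$. The function names and bit-string representation are illustrative choices, not notation from the paper.

```python
def split_into_subfiles(file_bits, num_subfiles):
    """Split a bit string into num_subfiles equal-length subfiles."""
    size = len(file_bits) // num_subfiles
    return [file_bits[i * size:(i + 1) * size] for i in range(num_subfiles)]

def subfile_distance(old_file, new_file, num_subfiles):
    """Count the subfiles in which the two files differ.

    Under the paper's model, this count is assumed to be at most
    epsilon, even though the server itself cannot compute it (it no
    longer holds the original file).
    """
    old = split_into_subfiles(old_file, num_subfiles)
    new = split_into_subfiles(new_file, num_subfiles)
    return sum(a != b for a, b in zip(old, new))

# Example: 12-bit files split into 4 subfiles of 3 bits each;
# the update touches only the second subfile.
old = "000111000111"
new = "000110000111"
print(subfile_distance(old, new, 4))  # -> 1
```

A blind update scheme must succeed for every pair of files at subfile distance at most $\epsilon$, since the server observes only `new`.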