Monitoring the Covid19 pandemic constitutes a critical societal stake that has received considerable research effort. The intensity of the pandemic on a given territory is efficiently measured by the reproduction number, which quantifies the rate of growth of daily new infections. Recently, estimates of the time evolution of the reproduction number were produced using an inverse problem formulation involving nonsmooth functional minimization. While this procedure was designed to be robust to the limited quality of Covid19 data (outliers, missing counts), it lacks the ability to output credibility-interval-based estimates. This remains a severe limitation for practical use by epidemiologists in actual pandemic monitoring, which the present work aims to overcome through Monte Carlo sampling. After recasting the nonsmooth functional within a Bayesian framework, several sampling schemes are tailored to accommodate the nonsmooth nature of the resulting posterior distribution. The originality of the devised algorithms stems from combining a Langevin Monte Carlo sampling scheme with proximal operators. The performances of the new algorithms in producing relevant credibility intervals for the reproduction number estimates and denoised counts are compared. Assessment is conducted on real daily new infection counts made available by the Johns Hopkins University. The interest of the devised monitoring tools is illustrated on Covid19 data from several different countries.
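To illustrate the general idea of combining Langevin Monte Carlo with proximal operators, the following is a minimal sketch, not the paper's exact algorithm: it samples a posterior of the form exp(-f(x) - g(x)), with f a smooth data-fidelity term and g a nonsmooth penalty handled through its proximal operator. All function names, parameters, and the simple synthetic data below are assumptions for illustration only.

```python
import numpy as np

def grad_f(x, y, phi):
    """Gradient of a Gaussian data-fidelity term 0.5 * ||y - phi * x||^2 (illustrative assumption)."""
    return phi * (phi * x - y)

def prox_g(x, step, lam):
    """Proximal operator of step * lam * ||x||_1 (soft-thresholding),
    standing in for the nonsmooth regularization of the posterior."""
    return np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)

def proximal_langevin(y, phi=1.0, lam=0.1, step=1e-2, n_iter=2000, seed=None):
    """Forward-backward Langevin sketch: gradient step on the smooth part,
    proximal step on the nonsmooth part, then Gaussian Langevin noise."""
    rng = np.random.default_rng(seed)
    x = np.zeros_like(y)
    samples = []
    for _ in range(n_iter):
        x = x - step * grad_f(x, y, phi)                              # smooth part
        x = prox_g(x, step, lam)                                      # nonsmooth part via prox
        x = x + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)    # Langevin noise
        samples.append(x.copy())
    return np.array(samples)

if __name__ == "__main__":
    # Toy piecewise-constant signal observed in noise (hypothetical data, not Covid19 counts).
    rng = np.random.default_rng(0)
    truth = np.concatenate([np.ones(50), 2.0 * np.ones(50)])
    y = truth + 0.3 * rng.standard_normal(truth.shape)
    chain = proximal_langevin(y, seed=1)
    # Pointwise 95% credibility band from empirical posterior quantiles (after burn-in).
    lo, hi = np.quantile(chain[500:], [0.025, 0.975], axis=0)
```

Credibility intervals are then read off directly as empirical quantiles of the Markov chain, which is the kind of uncertainty quantification the variational (functional minimization) estimate alone cannot provide.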