The paper addresses a sequential changepoint detection problem in which the duration of the change may be finite and unknown. This problem is important for many applications, e.g., signal and image processing, where signals appear and disappear at unknown points in time or space. In contrast to the conventional optimality criterion in quickest change detection, which requires minimizing the expected delay to detection for a given average run length to false alarm, we focus on a reliable maximin change detection criterion: maximizing the minimal probability of detection in a given time (or space) window for a given local maximal probability of false alarm in the prescribed window. We show that the optimal detection procedure is a modified CUSUM procedure. We then use Monte Carlo simulations to compare the operating characteristics of this optimal procedure with those of the Finite Moving Average (FMA) detection algorithm, which is popular in engineering, and the ordinary CUSUM procedure. The simulations show that the latter algorithms typically perform almost as well as the optimal one. At the same time, the FMA procedure has a substantial advantage: it does not depend on the intensity of the signal, which is usually unknown. Finally, the FMA algorithm is applied to detecting faint streaks of satellites in optical images.
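To make the comparison concrete, a minimal sketch of the two detection statistics in their standard forms is given below; the notation (log-likelihood ratio $Z_i$, window length $m$, thresholds $h$) is ours and not taken from the paper, and the optimal modified CUSUM procedure studied in the paper may differ in its details.
\[
W_n = \max\bigl(0,\; W_{n-1} + Z_n\bigr), \quad W_0 = 0, \qquad
T_{\mathrm{CUSUM}} = \inf\{n \ge 1 : W_n \ge h\},
\]
\[
R_n = \sum_{i=n-m+1}^{n} Z_i, \qquad
T_{\mathrm{FMA}} = \inf\{n \ge m : R_n \ge h\},
\]
where $Z_i = \log\bigl[f_1(X_i)/f_0(X_i)\bigr]$ is the log-likelihood ratio of the $i$-th observation. Roughly speaking, in the Gaussian case with an unknown positive mean shift, thresholding $R_n$ is equivalent to thresholding the moving sum of the raw observations, so the FMA statistic can be computed and calibrated without knowing the post-change signal intensity, whereas the CUSUM recursion requires it.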