Statistical signal processing applications usually require the estimation of some parameters of interest given a set of observed data. These estimates are typically obtained either by solving a multi-variate optimization problem, as in the maximum likelihood (ML) or maximum a posteriori (MAP) estimators, or by performing a multi-dimensional integration, as in the minimum mean squared error (MMSE) estimators. Unfortunately, analytical expressions for these estimators cannot be found in most real-world applications, and the Monte Carlo (MC) methodology provides a feasible alternative. MC methods proceed by drawing random samples, either from the desired distribution or from a simpler one, and using them to compute consistent estimators. The most important families of MC algorithms are Markov chain MC (MCMC) and importance sampling (IS). On the one hand, MCMC methods draw candidate samples from a proposal density and accept or reject them as the new state of the chain, thus building an ergodic Markov chain whose stationary distribution is the desired distribution. On the other hand, IS techniques draw samples from a simple proposal density and then assign each sample a weight that measures its quality in an appropriate way. In this paper, we perform a thorough review of MC methods for the estimation of static parameters in signal processing applications. A historical note on the development of MC schemes is also provided, followed by the basic MC method and a brief description of the rejection sampling (RS) algorithm, as well as three sections describing many of the most relevant MCMC and IS algorithms and their combined use.
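To make the distinction between the two families concrete, the following minimal sketch (not taken from the paper) illustrates them on a toy problem: a random-walk Metropolis-Hastings sampler (MCMC) and a self-normalized importance sampling estimator, both targeting a one-dimensional standard normal density. All function names and parameter choices here are illustrative assumptions.

```python
# Minimal sketch (illustrative only): the two MC families described above,
# applied to a 1-D standard normal target with Gaussian proposals.
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    """Unnormalized log-density of the target (standard normal)."""
    return -0.5 * x**2

# --- MCMC: random-walk Metropolis-Hastings ---
def metropolis_hastings(n_samples, step=1.0, x0=0.0):
    chain = np.empty(n_samples)
    x = x0
    for i in range(n_samples):
        x_prop = x + step * rng.normal()           # candidate from the proposal
        log_alpha = log_target(x_prop) - log_target(x)
        if np.log(rng.uniform()) < log_alpha:      # accept or reject the candidate
            x = x_prop
        chain[i] = x                               # current state of the chain
    return chain

# --- IS: weighted samples from a wider Gaussian proposal ---
def importance_sampling(n_samples, proposal_std=3.0):
    x = proposal_std * rng.normal(size=n_samples)  # draw from the proposal
    log_proposal = -0.5 * (x / proposal_std)**2 - np.log(proposal_std)
    log_w = log_target(x) - log_proposal           # unnormalized log-weights
    w = np.exp(log_w - log_w.max())
    w /= w.sum()                                   # self-normalized weights
    return x, w

chain = metropolis_hastings(5000)
xs, ws = importance_sampling(5000)
print("MCMC estimate of E[x^2]:", np.mean(chain**2))   # both should be close to 1
print("IS   estimate of E[x^2]:", np.sum(ws * xs**2))
```

In both cases the proposal is easy to sample from; the difference lies in how the target is approached: MCMC corrects the proposal through the accept/reject step, whereas IS corrects it through the weights.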