Adaptive filtering is a well-known problem with a wide range of applications, including echo cancellation. Extensive research over the past few decades has led to the invention of various algorithms. However, the known computationally efficient solutions exhibit a tradeoff between convergence speed and accuracy. Moreover, running these algorithms involves heuristically setting various parameters that considerably affect their performance. In this paper, we propose a new algorithm, which we refer to as online block maximum likelihood (OBML). OBML is a computationally efficient online learning algorithm that performs maximum likelihood (ML) estimation once every $P$ samples. We fully characterize the expected performance of OBML and show that i) OBML is able to asymptotically recover the unknown coefficients and ii) its expected estimation error asymptotically converges to zero as $O({1\over t})$. We also derive an alternative version of OBML, which we refer to as incremental maximum likelihood (IML), which incrementally updates its ML estimate of the coefficients at every sample. Our simulation results verify the analytical conclusions for memoryless inputs, and also show excellent performance of both OBML and IML in an audio echo cancellation application with strongly correlated input signals.
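To make the blockwise idea concrete, here is a minimal illustrative sketch, not the paper's exact OBML algorithm: for a linear filter $y_t = w^\top x_t + \text{noise}$ with Gaussian noise, the ML estimate of $w$ reduces to least squares over all samples seen so far, and an OBML-style scheme accumulates the sufficient statistics online but refreshes the estimate only at block boundaries, every $P$ samples. The filter length, block size, and noise level below are hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

M = 4                      # filter length (hypothetical)
P = 25                     # block size: re-estimate every P samples (hypothetical)
w_true = rng.normal(size=M)

R = np.zeros((M, M))       # running sum of x x^T (sufficient statistic)
r = np.zeros(M)            # running sum of y x  (sufficient statistic)
w_hat = np.zeros(M)

x_buf = np.zeros(M)        # regressor: the last M input samples
for t in range(1, 501):
    x_buf = np.roll(x_buf, 1)
    x_buf[0] = rng.normal()                      # memoryless input
    y = w_true @ x_buf + 0.01 * rng.normal()     # observed filter output
    R += np.outer(x_buf, x_buf)
    r += y * x_buf
    if t % P == 0:
        # Block boundary: ML (least-squares) refresh using all data so far.
        w_hat = np.linalg.solve(R, r)

err = np.linalg.norm(w_hat - w_true)
```

Between refreshes only two cheap accumulations run per sample; the $O(M^3)$ solve is amortized over each block of $P$ samples, which is the computational motivation for the blockwise structure.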