Sparse Bayesian learning (SBL) has emerged as a fast and competitive method for sparse processing. The SBL algorithm, developed within a Bayesian framework, approximately solves a non-convex optimization problem using fixed-point updates. It provides performance comparable to, and is significantly faster than, the convex optimization techniques commonly used in sparse processing. We propose a signal model that accounts for dictionary mismatch and for errors in the weight vector at low signal-to-noise ratios. A fixed-point update equation is derived that incorporates the statistics of both the mismatch and the weight errors. We also process observations from multiple dictionaries. Noise variances are estimated using stochastic maximum likelihood. The derived update equations are studied quantitatively in beamforming simulations for direction-of-arrival (DoA) estimation. Performance of SBL with single- and multi-frequency observations, and in the presence of aliasing, is evaluated. SwellEx-96 experimental data qualitatively demonstrate the advantages of SBL.