Natural Language Engineering meets the needs of professionals and researchers working in all areas of automated language processing, whether from the perspective of theoretical or corpus linguistics, translation, lexicography, computer science, or engineering. Its aim is to bridge the gap between traditional computational linguistics research and practical applications. In addition to publishing original research articles on a broad range of topics, from text analysis, machine translation, information retrieval, and speech processing and generation to integrated systems and multimodal interfaces, the journal also publishes special issues on particular natural language processing methods, tasks, or applications. Official website: http://dblp.uni-trier.de/db/journals/nle/

Latest Papers

Approximate message passing (AMP) is a low-cost iterative parameter-estimation technique for certain high-dimensional linear systems with non-Gaussian distributions. However, AMP only applies to independent identically distributed (IID) transform matrices and may become unreliable (e.g., perform poorly or even diverge) for other matrix ensembles, especially ill-conditioned ones. To handle this difficulty, orthogonal/vector AMP (OAMP/VAMP) was proposed for general bi-unitarily-invariant matrices. However, the Bayes-optimal OAMP/VAMP requires a high-complexity linear minimum mean square error (MMSE) estimator, which limits the application of OAMP/VAMP to large-scale systems. To overcome the disadvantages of AMP and OAMP/VAMP, this paper proposes a low-complexity memory AMP (MAMP) for unitarily-invariant matrices. MAMP consists of an orthogonal non-linear estimator (NLE) for denoising (the same as in OAMP/VAMP) and an orthogonal long-memory matched filter (MF) for interference suppression. The orthogonality principle is used to guarantee the asymptotic Gaussianity of the estimation errors in MAMP. A state evolution is derived to asymptotically characterize the performance of MAMP. The relaxation parameters and damping vector in MAMP are analytically optimized based on the state evolution to guarantee and improve convergence. MAMP has complexity comparable to AMP. Furthermore, for all unitarily-invariant matrices, the optimized MAMP converges to the high-complexity OAMP/VAMP, and is thus Bayes-optimal if it has a unique fixed point. Finally, simulations are provided to verify the validity and accuracy of the theoretical results.
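
The abstract describes the AMP family at a high level without showing the recursion itself. For orientation, the sketch below implements the classical AMP iteration for sparse recovery from y = Ax + noise with an i.i.d. Gaussian A and a soft-thresholding denoiser. It is a minimal illustration of basic AMP under those assumptions, not the MAMP algorithm of the paper; the function names, the `theta` threshold scaling, and the per-iteration noise estimate are illustrative choices rather than anything specified in the source.

```python
import numpy as np


def soft_threshold(r, tau):
    """Element-wise soft-thresholding denoiser used as the non-linear step."""
    return np.sign(r) * np.maximum(np.abs(r) - tau, 0.0)


def amp(A, y, iterations=30, theta=1.5):
    """Basic AMP recursion for y = A x + noise with an i.i.d. Gaussian A.

    `theta` is a hypothetical tuning knob (not from the paper) that scales the
    threshold against the effective noise level estimated from the residual.
    """
    m, n = A.shape
    x = np.zeros(n)      # current signal estimate
    z = y.copy()         # Onsager-corrected residual
    for _ in range(iterations):
        sigma = np.linalg.norm(z) / np.sqrt(m)      # effective noise level
        r = x + A.T @ z                             # pseudo-observation
        x_new = soft_threshold(r, theta * sigma)    # denoising step
        # Onsager correction: (old residual / m) times the divergence of the
        # denoiser, which for soft thresholding is the support size of x_new.
        onsager = (z / m) * np.count_nonzero(x_new)
        z = y - A @ x_new + onsager
        x = x_new
    return x


if __name__ == "__main__":
    # Synthetic demo: k-sparse signal, undersampled i.i.d. Gaussian measurements.
    rng = np.random.default_rng(0)
    n, m, k = 400, 200, 20
    A = rng.normal(0.0, 1.0 / np.sqrt(m), size=(m, n))
    x_true = np.zeros(n)
    x_true[rng.choice(n, size=k, replace=False)] = rng.normal(size=k)
    y = A @ x_true + 0.01 * rng.normal(size=m)
    x_hat = amp(A, y)
    print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

The Onsager correction is what keeps the denoiser input approximately Gaussian across iterations when A is i.i.d.; as the abstract notes, MAMP and OAMP/VAMP achieve the same asymptotic Gaussianity for general unitarily-invariant matrices through orthogonal estimators, with MAMP replacing the costly linear MMSE step of OAMP/VAMP by a long-memory matched filter.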
