Noise is an inherent part of data, whether the data comes from measurement, experiment, or ... In recent years, several techniques have been proposed for noise reduction to improve data quality, some of them based on wavelets, orthogonalization, or neural networks. However, the computational cost of existing methods is higher than expected, which makes their application unprofitable in some cases. In this paper, we propose a low-cost technique based on a special linear-algebra structure (tridiagonal systems) to improve signal quality. The method fits a tridiagonal model to the noise around the noisiest elements, and the algorithm is equipped with a learning/feedback approach to update the predicted noise. The details are described below; based on the presented numerical results, the algorithm computes the noise with a lower MSE (mean squared error) and lower computation time, especially when the data size is below 5000. Our algorithm targets low-range noise; for high-range noise, it is sufficient to use the presented algorithm in hybrid with a moving average. The algorithm is implemented in MATLAB 2019b on a Windows 11 computer with 8 GB of RAM and is tested on many randomly generated experiments. The numerical results confirm the efficiency of the presented algorithm in most cases in comparison with existing methods.
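The abstract does not specify how the tridiagonal systems are solved, but the standard low-cost solver for such systems is the Thomas algorithm, which runs in O(n) time rather than the O(n^3) of general Gaussian elimination; this is the kind of structure-exploiting step that keeps the method cheap. The sketch below is an illustrative Python implementation of that generic solver (the paper itself uses MATLAB, and the function name and interface here are our own assumptions), not the authors' specific noise model:

```python
import numpy as np

def solve_tridiagonal(a, b, c, d):
    """Solve T x = d for a tridiagonal matrix T via the Thomas algorithm.

    a: sub-diagonal   (length n-1)
    b: main diagonal  (length n)
    c: super-diagonal (length n-1)
    d: right-hand side (length n)

    Assumes T is diagonally dominant (no pivoting is performed).
    """
    n = len(b)
    cp = np.zeros(n - 1)  # modified super-diagonal
    dp = np.zeros(n)      # modified right-hand side

    # Forward sweep: eliminate the sub-diagonal.
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i - 1] * cp[i - 1]
        if i < n - 1:
            cp[i] = c[i] / m
        dp[i] = (d[i] - a[i - 1] * dp[i - 1]) / m

    # Back substitution.
    x = np.zeros(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```

Because each element couples only to its two neighbours, a single linear pass suffices, which is consistent with the abstract's claim of low computational cost for moderate data sizes.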