This paper investigates the classical integer least-squares problem, which estimates integer signals from linear models. The problem is NP-hard and arises in diverse applications such as signal processing, bioinformatics, communications, and machine learning. Since the existing optimal search strategies involve prohibitive complexity, they are difficult to adopt in large-scale problems. To address this issue, we propose a general hyper-accelerated tree search (HATS) algorithm that employs a deep neural network to estimate the optimal heuristic for the underlying simplified memory-bounded A* algorithm; the proposed approach also generalizes readily to other heuristic search algorithms. Inspired by temporal difference learning, we further propose a training strategy that enables the network to approximate the optimal heuristic accurately and consistently, so the proposed algorithm attains near-optimal efficiency when the estimation error is sufficiently small. Experiments show that the proposed algorithm achieves nearly optimal maximum likelihood estimation performance in large-scale problems with very low time and space complexity. The code for this paper is available at https://github.com/skypitcher/hats.
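To illustrate the kind of search the abstract describes, the following is a minimal, hypothetical sketch of best-first tree search for a toy integer least-squares instance. It is not the authors' HATS implementation: the heuristic `h` is fixed to zero here (the slot where the learned network estimate would plug in), the alphabet is binary, and the channel matrix is assumed lower-triangular so that the partial residual over the first `k` rows depends only on the first `k` decided symbols (standing in for the QR preprocessing used in practice).

```python
import heapq


def ils_best_first(H, y, alphabet=(0, 1)):
    """Best-first tree search for min ||y - H x||^2 over x in alphabet^n.

    Assumes H is lower-triangular, so fixing x[0..k-1] determines the
    residual on rows 0..k-1 exactly. Each node fixes one more symbol;
    its priority is g (accumulated partial cost) + h (heuristic).
    With h = 0 the search is plain uniform-cost search and still exact;
    HATS would replace h with a neural-network estimate of the
    cost-to-go to accelerate the search.
    """
    n = len(H[0])
    heap = [(0.0, ())]  # (priority, partial assignment)
    while heap:
        f, x = heapq.heappop(heap)
        if len(x) == n:
            return list(x), f  # first completed node is optimal (h admissible)
        for s in alphabet:
            child = x + (s,)
            k = len(child)
            # Partial cost over the first k rows (valid since H is
            # lower-triangular: those rows involve only child[0..k-1]).
            g = sum(
                (y[i] - sum(H[i][j] * child[j] for j in range(k))) ** 2
                for i in range(k)
            )
            h = 0.0  # placeholder for the learned heuristic
            heapq.heappush(heap, (g + h, child))
    return None, float("inf")
```

For example, with `H = [[2, 0, 0], [1, 3, 0], [0, 1, 2]]` and noiseless observations of `x = [1, 0, 1]`, the search recovers the true signal with zero residual while expanding only a fraction of the `2^3` leaves.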