The A* algorithm is commonly used to solve NP-hard combinatorial optimization problems. When provided with a completely informed heuristic function, A* solves many NP-hard minimum-cost path problems in time polynomial in the branching factor and the number of edges in a minimum-cost path. It follows that, unless P = NP, approximating such completely informed heuristic functions to high precision is itself NP-hard. We therefore examine recent publications that propose the use of neural networks for this purpose. We support, both theoretically and experimentally, our claim that these approaches do not scale to large instance sizes. Our first experimental results for three representative NP-hard minimum-cost path problems suggest that using neural networks to approximate completely informed heuristic functions with high precision might result in network sizes that scale exponentially in the instance sizes. The research community might thus benefit from investigating other ways of integrating heuristic search with machine learning.
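To make the premise concrete, here is a minimal Python sketch (a toy example constructed for illustration, not the authors' code or problems): A* guided by a completely informed heuristic, i.e. one returning the exact remaining cost to the goal, expands only nodes that lie on a minimum-cost path, which is what yields the polynomial running time the abstract refers to.

```python
import heapq

def a_star(graph, start, goal, h):
    """Standard A* on a weighted digraph; returns (path cost, nodes expanded)."""
    open_heap = [(h(start), 0, start)]  # entries: (f = g + h, g, node)
    g = {start: 0}
    closed = set()
    expansions = 0
    while open_heap:
        _, cost, node = heapq.heappop(open_heap)
        if node in closed:
            continue
        closed.add(node)
        expansions += 1
        if node == goal:
            return cost, expansions
        for nbr, w in graph[node]:
            ng = cost + w
            if ng < g.get(nbr, float("inf")):
                g[nbr] = ng
                heapq.heappush(open_heap, (ng + h(nbr), ng, nbr))
    return float("inf"), expansions

# Toy weighted digraph (hypothetical, for illustration only).
graph = {
    "s": [("a", 1), ("b", 4)],
    "a": [("b", 1), ("c", 5)],
    "b": [("c", 1)],
    "c": [],
}

def completely_informed_h(node):
    # Exact cost-to-go to "c", precomputed by hand for this toy graph.
    return {"s": 3, "a": 2, "b": 1, "c": 0}[node]

cost, expansions = a_star(graph, "s", "c", completely_informed_h)
# A* expands only the four nodes on the optimal path s -> a -> b -> c.
```

With the exact heuristic, the number of expansions equals the number of nodes on the optimal path; the hardness result discussed above stems from the fact that computing such a heuristic for an NP-hard problem cannot be easy.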