Expressive and computationally inexpensive bipartite Graph Neural Networks (GNNs) have been shown to be an important component of deep-learning-based Mixed-Integer Linear Program (MILP) solvers. Recent works have demonstrated the effectiveness of such GNNs in replacing the branching (variable selection) heuristic in branch-and-bound (B&B) solvers. These GNNs are trained offline, on a collection of MILPs, to imitate a very good but computationally expensive branching heuristic: strong branching. Given that B&B results in a tree of sub-MILPs, we ask (a) whether the target heuristic exhibits strong dependencies among neighboring nodes of the B&B tree, and (b) if so, whether we can incorporate them into our training procedure. Specifically, we find that with the strong branching heuristic, a child node's best choice is often the parent's second-best choice. We call this the "lookback" phenomenon. Surprisingly, the typical branching GNN of Gasse et al. (2019) often misses this simple "answer". To imitate the target behavior more closely by incorporating the lookback phenomenon into GNNs, we propose two methods: (a) target smoothing for the standard cross-entropy loss function, and (b) adding a Parent-as-Target (PAT) Lookback regularizer term. Finally, we propose a model selection framework to incorporate harder-to-formulate objectives, such as solving time, into the final models. Through extensive experimentation on standard benchmark instances, we show that our proposal results in up to a 22% decrease in the size of the B&B tree and up to a 15% improvement in solving times.
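The target-smoothing idea can be sketched in a few lines: instead of a one-hot cross-entropy target on the strong-branching (SB) choice, a small amount of probability mass is shifted onto the parent node's SB choice, encoding the lookback dependency. The sketch below is a minimal illustration under assumed conventions; the function names, the smoothing weight `eps`, and its default value are hypothetical and not taken from the paper.

```python
import numpy as np

def smoothed_target(num_vars, sb_choice, parent_choice=None, eps=0.1):
    """Soft cross-entropy target encoding the lookback phenomenon.

    Hypothetical sketch: put (1 - eps) mass on this node's strong-branching
    choice and eps mass on the parent's choice, instead of a one-hot target.
    """
    target = np.zeros(num_vars)
    target[sb_choice] = 1.0
    if parent_choice is not None and parent_choice != sb_choice:
        target[sb_choice] -= eps
        target[parent_choice] += eps
    return target

def cross_entropy(logits, target):
    """Cross-entropy between a soft target and the GNN's variable scores."""
    z = logits - logits.max()                      # numerically stable
    log_probs = z - np.log(np.exp(z).sum())        # log-softmax
    return -(target * log_probs).sum()
```

A PAT-style regularizer would analogously add a second loss term pulling the child's prediction toward the parent's choice, weighted by a hyperparameter.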