Recent deep models for solving routing problems typically assume a single node distribution during training, which severely impairs their cross-distribution generalization ability. In this paper, we exploit group distributionally robust optimization (group DRO) to tackle this issue, jointly optimizing the weights for different groups of distributions and the parameters of the deep model in an interleaved manner during training. We also design a module based on a convolutional neural network, which allows the deep model to learn more informative latent patterns among the nodes. We evaluate the proposed approach on two types of well-known deep models, i.e., GCN and POMO. The experimental results on randomly synthesized instances and on instances from two benchmark datasets (i.e., TSPLib and CVRPLib) demonstrate that our approach significantly improves cross-distribution generalization performance over the original models.
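For concreteness, a minimal sketch of the interleaved group DRO training described above is given below. The notation is assumed rather than taken from the paper: $K$ denotes the number of distribution groups, $\mathcal{L}_k(\theta)$ the expected cost (e.g., tour length) of the model with parameters $\theta$ on group $k$, $q \in \Delta_K$ the group weights on the probability simplex, and $\eta$, $\alpha$ hypothetical step sizes; the exponentiated-gradient update for $q$ is a standard choice for group DRO and may differ from the exact procedure of the paper.
\[
\min_{\theta} \; \max_{q \in \Delta_K} \; \sum_{k=1}^{K} q_k \, \mathcal{L}_k(\theta),
\]
with the two update steps interleaved during training:
\[
q_k \leftarrow \frac{q_k \exp\!\bigl(\eta \, \mathcal{L}_k(\theta)\bigr)}{\sum_{j=1}^{K} q_j \exp\!\bigl(\eta \, \mathcal{L}_j(\theta)\bigr)},
\qquad
\theta \leftarrow \theta - \alpha \sum_{k=1}^{K} q_k \, \nabla_{\theta} \mathcal{L}_k(\theta).
\]
Intuitively, the weight step shifts probability mass toward the groups on which the current model performs worst, and the parameter step then trains the model against this reweighted mixture, which discourages overfitting to any single node distribution.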