We use ansatz neural network models to predict key complexity metrics of Gr\"obner bases of binomial ideals. This work illustrates why making predictions with neural networks from Gr\"obner computations is not a straightforward process. Using two probabilistic models for random binomial ideals, we generate and make available a large data set that captures sufficient variability in Gr\"obner complexity. We use this data to train neural networks that predict the cardinality of a reduced Gr\"obner basis and the maximum total degree of its elements. While the cardinality prediction problem is unlike classical problems tackled by machine learning, our simulations show that neural networks, achieving performance statistics such as $r^2 = 0.401$, outperform both the naive guess and multiple regression models ($r^2 = 0.180$).
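The two prediction targets named above can be computed directly with a computer algebra system. The sketch below uses SymPy to sample a small random binomial ideal and extract the cardinality of its reduced Gr\"obner basis and the maximum total degree of its elements; the sampling scheme here is purely illustrative and is not either of the paper's probabilistic models.

```python
# Illustrative sketch: compute the two Groebner complexity metrics
# (reduced basis cardinality, max total degree) for a random
# binomial ideal. The random model below is an assumption for
# demonstration, not the paper's data-generation procedure.
import random
from sympy import symbols, groebner, Poly

random.seed(0)
x, y, z = symbols("x y z")
gens_vars = (x, y, z)

def random_binomial(max_exp=3):
    """Difference of two random monomials in x, y, z (nonzero)."""
    while True:
        m1 = x**random.randint(0, max_exp) \
             * y**random.randint(0, max_exp) \
             * z**random.randint(0, max_exp)
        m2 = x**random.randint(0, max_exp) \
             * y**random.randint(0, max_exp) \
             * z**random.randint(0, max_exp)
        if m1 != m2:          # avoid the zero polynomial
            return m1 - m2

# A random binomial ideal on three generators.
gens = [random_binomial() for _ in range(3)]

# SymPy returns the reduced Groebner basis by default.
gb = groebner(gens, *gens_vars, order="grevlex")

cardinality = len(gb.exprs)
max_total_degree = max(Poly(g, *gens_vars).total_degree()
                       for g in gb.exprs)
print(cardinality, max_total_degree)
```

Since every generator vanishes at the point $(1,1,1)$, the ideal is proper, so the reduced basis is nonempty and contains no nonzero constant; both metrics are therefore at least 1.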