Normalization of polynomials plays a vital role in the approximate basis computation of vanishing ideals. Coefficient normalization, which normalizes a polynomial by the norm of its coefficient vector, is the most common approach in computer algebra. Inspired by recent developments in machine learning, this study proposes gradient-weighted normalization for the approximate border basis computation of vanishing ideals. The data-dependent nature of gradient-weighted normalization yields better stability against perturbation and consistency under scaling of the input points, neither of which coefficient normalization attains. Only a subtle change is needed to introduce gradient-weighted normalization into existing algorithms based on coefficient normalization; the analysis of these algorithms carries over with small modifications, and their time complexity remains unchanged. We also prove that with coefficient normalization, which lacks the scaling-consistency property, scaling the input points (e.g., as preprocessing) can cause an approximate basis computation to fail. This study is the first to theoretically highlight the crucial effect of scaling in approximate basis computation, and it demonstrates the utility of data-dependent normalization.
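To make the contrast concrete, the following is a minimal sketch (not the paper's implementation; all function names are illustrative) of the two norms for a bivariate polynomial: the coefficient norm is the Euclidean norm of the coefficient vector and ignores the data, while a gradient-weighted norm aggregates the gradient of the polynomial over the input points and therefore rescales along with them.

```python
import math

# A polynomial is represented as {(i, j): coeff}, meaning coeff * x^i * y^j.

def grad(poly):
    """Return (d/dx, d/dy) as two polynomials in the same representation."""
    dx, dy = {}, {}
    for (i, j), c in poly.items():
        if i > 0:
            dx[(i - 1, j)] = dx.get((i - 1, j), 0.0) + i * c
        if j > 0:
            dy[(i, j - 1)] = dy.get((i, j - 1), 0.0) + j * c
    return dx, dy

def evaluate(poly, x, y):
    return sum(c * x**i * y**j for (i, j), c in poly.items())

def coeff_norm(poly):
    """Data-independent: Euclidean norm of the coefficient vector."""
    return math.sqrt(sum(c * c for c in poly.values()))

def grad_weighted_norm(poly, points):
    """Data-dependent: root-sum-of-squares of the gradient over the points."""
    dx, dy = grad(poly)
    return math.sqrt(sum(evaluate(dx, x, y) ** 2 + evaluate(dy, x, y) ** 2
                         for x, y in points))

# h(x, y) = x^2 + y^2 - 1 vanishes on the unit circle.
h = {(2, 0): 1.0, (0, 2): 1.0, (0, 0): -1.0}
points = [(1.0, 0.0), (0.0, 1.0), (-1.0, 0.0)]

print(coeff_norm(h))                  # unchanged if the points are rescaled
print(grad_weighted_norm(h, points))  # rescales along with the points
```

Rescaling the points (e.g., `[(a * x, a * y) for x, y in points]`) changes the gradient-weighted norm but leaves the coefficient norm untouched, which illustrates why only the former can provide the scaling-consistency property discussed above.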