Normalization of polynomials plays a vital role in the approximate basis computation of vanishing ideals. Coefficient normalization, which normalizes a polynomial by the norm of its coefficient vector, is the most common method in computer algebra. This study proposes the gradient-weighted normalization method for the approximate border basis computation of vanishing ideals, inspired by recent developments in machine learning. The data-dependent nature of gradient-weighted normalization leads to better stability against perturbation and to consistency under the scaling of input points, neither of which can be attained by coefficient normalization. Only a subtle change is needed to introduce gradient-weighted normalization into existing algorithms that use coefficient normalization. The existing analysis of these algorithms carries over with only minor modifications, and their asymptotic time complexity remains unchanged. We also prove that coefficient normalization, which lacks the scaling-consistency property, can cause an approximate basis computation to fail when the points are scaled (e.g., as preprocessing). This study is the first to theoretically highlight the crucial effect of scaling in approximate basis computation and demonstrates the utility of data-dependent normalization.
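The contrast between the two normalizations can be illustrated with a minimal sketch. Here the coefficient norm is the Euclidean norm of the coefficient vector (data-independent), while the gradient-weighted norm aggregates the polynomial's gradient evaluated at the input points; the exact definitions in the paper may differ, and the polynomial `h(x, y) = x^2 + y^2 - 1` and the sampled points are hypothetical examples chosen for illustration.

```python
import numpy as np

def coefficient_norm(coeffs):
    # Coefficient normalization: the norm depends only on the
    # coefficient vector, not on the input points.
    return np.linalg.norm(coeffs)

def gradient_weighted_norm(grad, X):
    # Gradient-weighted normalization (sketch): aggregate the
    # polynomial's gradient evaluated at the input points X,
    # so the norm scales together with the data.
    G = np.array([grad(x) for x in X])  # shape (n_points, n_vars)
    return np.linalg.norm(G)

# Hypothetical example: h(x, y) = x^2 + y^2 - 1 (vanishes on the unit circle).
coeffs = np.array([1.0, 1.0, -1.0])            # coefficients of x^2, y^2, 1
grad = lambda p: np.array([2 * p[0], 2 * p[1]])  # grad h = (2x, 2y)

t = np.linspace(0, 2 * np.pi, 8, endpoint=False)
X = np.stack([np.cos(t), np.sin(t)], axis=1)   # points on the circle

# Scaling the points by a factor (e.g., as preprocessing) leaves the
# coefficient norm unchanged but rescales the gradient-weighted norm
# consistently with the data.
a = 10.0
print(coefficient_norm(coeffs))            # unchanged under point scaling
print(gradient_weighted_norm(grad, X))
print(gradient_weighted_norm(grad, a * X))  # grows with the scaling factor
```

Because the gradient of this `h` is linear in the point coordinates, scaling the points by `a` scales the gradient-weighted norm by exactly `a`, while the coefficient norm is oblivious to the change; this is the data-dependence the abstract refers to.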