In recent years, the approximate basis computation of vanishing ideals has been studied extensively and adopted in both computer algebra and data-driven applications such as machine learning. However, symbolic computation and the dependency on monomial orderings remain essential gaps between these two fields. In this paper, we propose the first efficient monomial-agnostic approximate basis computation of vanishing ideals, in which polynomials are manipulated without any information about monomials; the computation can thus be implemented in a fully numerical manner, which is desirable for data-driven applications. In particular, we propose gradient normalization, which not only achieves the first efficient and monomial-agnostic normalization of polynomials but also provides significant advantages, such as consistency under translation and scaling of the data points, which existing basis computation algorithms cannot realize. During the basis computation, the gradients of the polynomials at the given points can provably be obtained efficiently and exactly without performing differentiation. By exploiting this gradient information, we further propose a basis reduction method that removes redundant polynomials in a monomial-agnostic manner. Finally, we propose a regularization method based on gradients to avoid overfitting of the basis to the given perturbed points.
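The idea of gradient normalization can be illustrated with a toy sketch. The sketch below is purely illustrative and not the paper's algorithm: the polynomial g, its gradient, and the point set X are hypothetical examples, and here the gradient is hard-coded rather than obtained through the basis computation as in the paper.

```python
import numpy as np

# Hypothetical polynomial g(x, y) = x^2 + y^2 - 1 and its gradient
# (2x, 2y), written out by hand purely for illustration.
def g(p):
    x, y = p
    return x**2 + y**2 - 1.0

def grad_g(p):
    x, y = p
    return np.array([2.0 * x, 2.0 * y])

# Example points lying on the unit circle, so g approximately vanishes on X.
X = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0]])

# Gradient (semi)norm of g over X: the root of the summed squared
# gradient norms at the points. Normalizing by this quantity uses only
# numerical evaluations at X -- no monomial ordering is involved.
Z = np.sqrt(sum(np.dot(grad_g(p), grad_g(p)) for p in X))

def g_normalized(p):
    return g(p) / Z
```

Because Z is computed from evaluations at the data points alone, it can be obtained fully numerically; this is the sense in which such a normalization is monomial-agnostic, in contrast to coefficient normalization, which requires access to the monomial representation of g.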