Border Basis Computation with Gradient-Weighted Norm
arXiv - CS - Symbolic Computation Pub Date : 2021-01-02 , DOI: arxiv-2101.00401
Hiroshi Kera

Normalization of polynomials plays an essential role in the approximate basis computation of vanishing ideals. In computer algebra, coefficient normalization, which normalizes a polynomial by the norm of its coefficient vector, is the most common method. In this study, inspired by recent results in machine learning, we propose gradient-weighted normalization for the approximate border basis computation of vanishing ideals. The data-dependent nature of gradient-weighted normalization yields powerful properties, such as better stability against perturbation and a form of consistency under scaling of the input points, which cannot be attained by conventional coefficient normalization. With a slight modification, the analysis of algorithms with coefficient normalization carries over to gradient-weighted normalization, and the time complexity does not change. We also provide an upper bound on the coefficient norm with respect to the gradient-weighted norm, which allows us to discuss approximate border bases with gradient-weighted normalization from the perspective of the coefficient norm.
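To make the contrast concrete, the sketch below compares the two normalizations on a toy polynomial over a small point set. This is an illustrative reading of the abstract, not the paper's algorithm: we take the coefficient norm to be the l2 norm of the coefficient vector, and the gradient-weighted norm to be the l2 norm of the polynomial's gradients stacked over the data points (the helper names and the exact definition are assumptions for illustration).

```python
import numpy as np

def coefficient_norm(coeffs):
    """Classical normalization: l2 norm of the coefficient vector.
    This norm ignores the data entirely."""
    return np.linalg.norm(coeffs)

def gradient_weighted_norm(grad_fn, X):
    """Data-dependent norm (assumed form): l2 norm of the gradients
    of the polynomial evaluated at every input point in X."""
    grads = np.array([grad_fn(p) for p in X])  # shape (N, n)
    return np.linalg.norm(grads)

# Toy polynomial g(x, y) = x^2 + y - 1 with coefficient vector
# (1, 1, -1) and gradient (2x, 1), on three sample points.
X = np.array([[0.5, 1.0], [1.0, -0.5], [-1.5, 0.25]])
coeffs = np.array([1.0, 1.0, -1.0])
grad_g = lambda p: np.array([2.0 * p[0], 1.0])

print(coefficient_norm(coeffs))           # independent of X
print(gradient_weighted_norm(grad_g, X))  # changes if X is rescaled
```

The point of the comparison: the coefficient norm is fixed once the polynomial is written down, while the gradient-weighted norm responds to the placement and scale of the input points, which is what enables the scaling-consistency property claimed in the abstract.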

Updated: 2021-01-05