Learning a performance metric of Buchberger's algorithm
arXiv - CS - Symbolic Computation. Pub Date: 2021-06-07. arXiv: 2106.03676
Jelena Mojsilović, Dylan Peifer, Sonja Petrović

What can be (machine) learned about the complexity of Buchberger's algorithm? Given a system of polynomials, Buchberger's algorithm computes a Gröbner basis of the ideal these polynomials generate using an iterative procedure based on multivariate long division. The runtime of each step of the algorithm is typically dominated by a series of polynomial additions, and the total number of these additions is a hardware-independent performance metric that is often used to evaluate and optimize various implementation choices. In this work we attempt to predict, using just the starting input, the number of polynomial additions that take place during one run of Buchberger's algorithm. Good predictions are useful for quickly estimating difficulty and understanding what features make Gröbner basis computation hard. Our features and methods could also be used for value models in the reinforcement learning approach to optimizing Buchberger's algorithm introduced in [Peifer, Stillman, and Halpern-Leistner, 2020]. We show that a multiple linear regression model built from a set of easy-to-compute ideal generator statistics can predict the number of polynomial additions somewhat well, better than an uninformed model, and better than regression models built on some intuitive commutative algebra invariants that are more difficult to compute. We also train a simple recursive neural network that outperforms these linear models. Our work serves as a proof of concept, demonstrating that predicting the number of polynomial additions in Buchberger's algorithm is a feasible problem from the point of view of machine learning.
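To make the performance metric concrete, the following is a minimal toy sketch (not the paper's implementation) of Buchberger's algorithm over the rationals with lex order, instrumented to count polynomial additions. The data representation (dicts mapping exponent tuples to coefficients) and the pair-selection order are illustrative choices; real systems such as the one studied in the paper use graded orders and pair-elimination strategies that this sketch omits.

```python
# Toy Buchberger over Q with lex order, counting polynomial additions.
# Polynomials are dicts: exponent tuple -> Fraction coefficient.
# Illustrative sketch only; not the paper's implementation.
from fractions import Fraction

def lead(f):
    """Leading (monomial, coefficient) under lex order (tuples compare lexicographically)."""
    m = max(f)
    return m, f[m]

def mul_term(c, m, f):
    """Return c * x^m * f."""
    return {tuple(x + y for x, y in zip(e, m)): c * a for e, a in f.items()}

def sub_mul(f, c, m, g, counter):
    """Return f - c * x^m * g, counted as one polynomial addition."""
    counter[0] += 1
    out = dict(f)
    for e, a in g.items():
        e2 = tuple(x + y for x, y in zip(e, m))
        out[e2] = out.get(e2, Fraction(0)) - c * a
        if out[e2] == 0:
            del out[e2]
    return out

def reduce_poly(f, G, counter):
    """Top-reduce f by G via multivariate long division."""
    f = dict(f)
    progress = True
    while f and progress:
        progress = False
        mf, cf = lead(f)
        for g in G:
            mg, cg = lead(g)
            if all(a >= b for a, b in zip(mf, mg)):
                q = tuple(a - b for a, b in zip(mf, mg))
                f = sub_mul(f, cf / cg, q, g, counter)
                progress = True
                break
    return f

def spoly(f, g, counter):
    """S-polynomial of f and g: cancel the lcm of their leading terms."""
    (mf, cf), (mg, cg) = lead(f), lead(g)
    l = tuple(max(a, b) for a, b in zip(mf, mg))
    s = mul_term(1 / cf, tuple(a - b for a, b in zip(l, mf)), f)
    return sub_mul(s, 1 / cg, tuple(a - b for a, b in zip(l, mg)), g, counter)

def buchberger(F):
    """Return (a Groebner basis of <F>, number of polynomial additions)."""
    counter = [0]
    G = [{e: Fraction(c) for e, c in f.items()} for f in F]
    pairs = [(i, j) for i in range(len(G)) for j in range(i + 1, len(G))]
    while pairs:
        i, j = pairs.pop()
        r = reduce_poly(spoly(G[i], G[j], counter), G, counter)
        if r:  # nonzero remainder: enlarge the basis and enqueue new pairs
            pairs += [(k, len(G)) for k in range(len(G))]
            G.append(r)
    return G, counter[0]

# Example: the ideal <x^2 - y, x*y - 1> in Q[x, y]; exponents are (deg_x, deg_y).
f1 = {(2, 0): 1, (0, 1): -1}   # x^2 - y
f2 = {(1, 1): 1, (0, 0): -1}   # x*y - 1
G, additions = buchberger([f1, f2])
```

The returned `additions` is the kind of hardware-independent count the paper's models try to predict from the input generators alone; note it depends on the pair-selection strategy, which is exactly why the metric is a target for learning and optimization.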

Updated: 2021-06-08