Optimizing a polynomial function on a quantum processor
npj Quantum Information (IF 6.6), Pub Date: 2021-01-29, DOI: 10.1038/s41534-020-00351-5
Keren Li, Shijie Wei, Pan Gao, Feihao Zhang, Zengrong Zhou, Tao Xin, Xiaoting Wang, Patrick Rebentrost, Guilu Long

The gradient descent method is central to numerical optimization and is a key ingredient in many machine learning algorithms. It promises to find a local minimum of a function by iteratively moving along the direction of steepest descent. Since the required computational resources can be prohibitive for high-dimensional problems, it is desirable to investigate quantum versions of gradient descent, such as the one recently proposed by Rebentrost et al.1. Here, we develop this protocol and implement it on a quantum processor with limited resources. A prototypical experiment on a four-qubit nuclear magnetic resonance quantum processor demonstrates the iterative optimization process. Experimentally, the final point converges to the local minimum with a fidelity >94%, quantified via full-state tomography. Moreover, our method can be applied to a multidimensional scaling problem, showing the potential to outperform its classical counterparts. Considering the ongoing efforts in quantum information and data science, our work may provide a faster approach to solving high-dimensional optimization problems and a subroutine for future practical quantum computers.
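To make the iterative update concrete, the sketch below shows the classical steepest-descent rule x_{t+1} = x_t - η∇f(x_t) applied to a small example polynomial. This is only a minimal classical baseline under assumed values (the polynomial, step size eta, and starting point are hypothetical choices, not taken from the paper); the quantum protocol of Rebentrost et al. performs the analogous update on amplitude-encoded states and is not reproduced here.

```python
# Minimal classical gradient descent sketch (illustrative only; not the
# quantum protocol of the paper). Each iteration moves along the
# direction of steepest descent: x <- x - eta * grad f(x).

def gradient_descent(grad, x0, eta=0.1, steps=100, tol=1e-8):
    """Iterate the steepest-descent update until the step is tiny."""
    x = list(x0)
    for _ in range(steps):
        g = grad(x)
        x_new = [xi - eta * gi for xi, gi in zip(x, g)]
        # Stop when successive iterates barely move.
        if sum((a - b) ** 2 for a, b in zip(x, x_new)) < tol:
            return x_new
        x = x_new
    return x

# Hypothetical example polynomial: f(x, y) = x^4 + y^4 + x^2 y^2,
# whose gradient is (4x^3 + 2xy^2, 4y^3 + 2x^2 y).
def grad_f(x):
    return [4 * x[0] ** 3 + 2 * x[0] * x[1] ** 2,
            4 * x[1] ** 3 + 2 * x[1] * x[0] ** 2]

print(gradient_descent(grad_f, [1.0, -0.5]))  # converges toward the minimum at the origin
```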



Updated: 2021-01-29