Fluctuation-dissipation-type theorem in stochastic linear learning
Physical Review E (IF 2.4) · Pub Date: 2021-09-20 · DOI: 10.1103/PhysRevE.104.034126
Manhyung Han, Jeonghyeok Park, Taewoong Lee, Jung Hoon Han

The fluctuation-dissipation theorem (FDT) is a simple yet powerful consequence of the first-order differential equation governing the dynamics of systems subject simultaneously to dissipative and stochastic forces. Linear learning dynamics, in which the input vector maps to the output vector through a linear matrix whose elements are the subject of learning, has a stochastic version closely mimicking Langevin dynamics when the full-batch gradient descent scheme is replaced by stochastic gradient descent. We derive a generalized FDT for the stochastic linear learning dynamics and verify its validity on well-known machine learning data sets such as MNIST, CIFAR-10, and EMNIST.
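A minimal sketch of the Langevin-like picture the abstract describes: when a linear map is trained by SGD on noisy data, the mini-batch gradient acts as a stochastic force, and the weights settle into a steady state whose fluctuations grow with the learning rate (which plays the role of an effective temperature in an FDT-type relation). The model, parameter values, and variance measurement below are illustrative assumptions, not the paper's actual derivation or experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear teacher with label noise, so that the loss minimum
# retains nonzero per-sample gradients (the "stochastic force").
d_in, d_out, n = 5, 3, 2000
W_true = rng.normal(size=(d_out, d_in))
X = rng.normal(size=(n, d_in))
Y = X @ W_true.T + 0.5 * rng.normal(size=(n, d_out))

def sgd_weight_variance(lr, batch=8, burn_in=2000, steps=4000):
    """Train W by mini-batch SGD on the MSE loss and return the
    steady-state variance of the weights after a burn-in period."""
    W = np.zeros((d_out, d_in))
    samples = []
    for t in range(burn_in + steps):
        idx = rng.integers(0, n, size=batch)
        xb, yb = X[idx], Y[idx]
        grad = (W @ xb.T - yb.T) @ xb / batch  # gradient of (1/2)·MSE
        W -= lr * grad
        if t >= burn_in:
            samples.append(W.copy())
    S = np.stack(samples)
    return S.var(axis=0).mean()  # average fluctuation per weight

# Larger learning rate -> larger steady-state weight fluctuations,
# the qualitative signature of an FDT-type relation for SGD.
v_small = sgd_weight_variance(lr=0.01)
v_large = sgd_weight_variance(lr=0.05)
```

This only demonstrates the qualitative scaling; the paper's generalized FDT relates the fluctuation and dissipation terms quantitatively for the full stochastic linear learning dynamics.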

Updated: 2021-09-21