Emergence of transient chaos and intermittency in machine learning
Journal of Physics: Complexity, Pub Date: 2021-07-02, DOI: 10.1088/2632-072x/ac0b00
Ling-Wei Kong, Huawei Fan, Celso Grebogi, Ying-Cheng Lai
An emerging paradigm for predicting the state evolution of chaotic systems is machine learning with reservoir computing, the core of which is a dynamical network of artificial neurons. Through training with measured time series, a reservoir machine can be harnessed to replicate the evolution of the target chaotic system for some amount of time, typically about half a dozen Lyapunov times. Recently, we developed a reservoir computing framework with an additional parameter channel for predicting system collapse and the chaotic transients associated with a crisis. It was found that the crisis point, after which transient chaos emerges, can be accurately predicted. The idea of adding a parameter channel to reservoir computing has also been used by others to predict bifurcation points and distinct asymptotic behaviors. In this paper, we address three issues associated with machine-generated transient chaos. First, we report the results of a detailed study of the statistical behaviors of transient chaos generated by our parameter-aware reservoir computing machine. When multiple time series from a small number of distinct values of the bifurcation parameter, all in the regime of attracting chaos, are deployed to train the reservoir machine, it can generate the correct dynamical behavior in the regime of transient chaos of the target system, in the sense that the basic statistical features of the machine-generated transient chaos agree with those of the real system. Second, we demonstrate that our machine learning framework can reproduce the intermittency of the target system. Third, we consider a system for which the known methods of sparse optimization fail to predict the crisis and demonstrate that our reservoir computing scheme can solve this problem. These findings have potential applications in anticipating system collapse as induced by, e.g., a parameter drift that places the system in a transient regime.
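To make the idea of a parameter channel concrete, the sketch below shows one way such a parameter-aware reservoir computer could be set up; it is a minimal illustration under stated assumptions, not the authors' implementation. An echo state network receives the system state plus one extra input channel carrying the bifurcation parameter, the linear readout is fit by ridge regression on time series from a few parameter values in the attracting-chaos regime, and the machine is then run in closed loop with the parameter channel set to a value beyond the training range. The reservoir size, input scalings, regularization, and the logistic-map surrogate (boundary crisis at a = 4) are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# --- Reservoir setup (all hyperparameters are illustrative guesses) ---
N = 300                     # reservoir size
rho = 0.9                   # spectral radius
W = rng.normal(size=(N, N))
W *= rho / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, size=(N, 2))   # input: [state, parameter]

def step(r, u, p):
    """One reservoir update driven by scalar state u and parameter value p."""
    return np.tanh(W @ r + W_in @ np.array([u, p]))

# --- Surrogate target system: logistic map x -> a*x*(1-x), crisis at a = 4 ---
def logistic(a, n, x0=0.4):
    x = np.empty(n)
    x[0] = x0
    for t in range(n - 1):
        x[t + 1] = a * x[t] * (1.0 - x[t])
    return x

# Train on a few parameter values in the attracting-chaos regime (a < 4)
train_params = [3.91, 3.93, 3.95]
T, washout = 3000, 200
R_list, Y_list = [], []
for a in train_params:
    x = logistic(a, T)
    r = np.zeros(N)
    for t in range(T - 1):
        r = step(r, x[t], a)
        if t >= washout:
            R_list.append(r)
            Y_list.append(x[t + 1])
R = np.array(R_list)
Y = np.array(Y_list)

# Ridge (Tikhonov) regression for the linear readout
beta = 1e-6
W_out = np.linalg.solve(R.T @ R + beta * np.eye(N), R.T @ Y)

# --- Closed-loop generation at a parameter value beyond the crisis ---
def generate(a_gen, n_steps, a_warm=3.95, warm=300):
    """Warm up with data from a training parameter, then let the machine evolve
    autonomously with the parameter channel set to a_gen (no data needed there)."""
    x = logistic(a_warm, warm)
    r = np.zeros(N)
    for t in range(warm - 1):
        r = step(r, x[t], a_warm)
    u, out = x[-1], []
    for _ in range(n_steps):
        r = step(r, u, a_gen)
        u = float(W_out @ r)
        out.append(u)
    return np.array(out)

# Proxy for the machine-predicted transient lifetime at a = 4.02:
# count iterations before the generated orbit leaves the unit interval.
orbit = generate(a_gen=4.02, n_steps=2000)
mask = (orbit < 0.0) | (orbit > 1.0)
lifetime = int(np.argmax(mask)) if mask.any() else len(orbit)
print("predicted transient lifetime:", lifetime)
```

In the spirit of the study, one would sweep the generation parameter past the training range, locate the value at which sustained chaos gives way to finite-lifetime transients (the machine-predicted crisis point), and compare the distribution of predicted transient lifetimes over many initial conditions with that of the real system; this sketch only illustrates the architecture, not those statistical comparisons.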



Updated: 2021-07-02