Improving the Robustness of Recursive Consequent Parameters Learning in Evolving Neuro-Fuzzy Systems
Information Sciences (IF 8.1), Pub Date: 2020-09-23, DOI: 10.1016/j.ins.2020.09.026
Edwin Lughofer

During the last 15 to 20 years, evolving (neuro-)fuzzy systems (E(N)FS) have attracted increasing attention in the context of data stream mining and modeling. This is because they can be updated on the fly in a single-pass, sample-wise manner and can autonomously change the model structure in order to react to process drifts. A wide variety of E(N)FS approaches have been proposed that handle data stream mining and modeling by dynamically updating the rule structure and antecedents. The common denominator in the update of the consequent (output weight) parameters is the recursive (fuzzily weighted) least squares estimator (R(FW)LS), which is applied in almost all E(N)FS approaches. In this paper, we propose and examine alternative variants for consequent parameter updates, namely multi-innovation RFWLS, recursive correntropy and, especially, recursive weighted total least squares (RWTLS). Multi-innovation RFWLS guarantees more stable updates whenever structural changes (i.e., changes in the antecedents) of the E(N)FS are performed, because the rule membership degrees are re-evaluated on (a portion of) past samples and properly integrated into each update step. Recursive correntropy addresses the problem of outliers by down-weighting the influence of large errors in the parameter updates. Recursive weighted total least squares additionally accounts for a possible noise level in the input variables (and not solely in the target variable, as RFWLS does). The approaches are compared with standard RFWLS (i) on three data stream regression problems from practical applications, which are affected by different noise levels and one of which embeds a known drift, and (ii) on a time-series-based forecasting problem.
The results, based on accumulated prediction error trends over time, indicate that RFWLS can be largely outperformed by the proposed alternative variants, which also show lower sensitivity to various data noise levels. The proposed variants are thus worth further consideration as promising and serious alternatives.
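To make the building blocks concrete, the following is a minimal sketch of a single recursive (fuzzily) weighted least squares step for one rule's consequent parameters, combined with a Gaussian-kernel error weight in the spirit of the correntropy idea described above. This is an illustrative implementation, not the paper's actual algorithm: the function names, the forgetting factor, and the kernel width sigma are assumptions for the sketch.

```python
import numpy as np

def rwls_update(w, P, x, y, psi=1.0, lam=1.0):
    """One recursive weighted least-squares step for a single rule.

    w   : current consequent parameter vector, shape (d,)
    P   : inverse-covariance-like matrix, shape (d, d)
    x   : regressor vector, shape (d,) (e.g. inputs plus a bias entry)
    y   : scalar target value
    psi : sample weight, e.g. the rule's fuzzy membership degree
    lam : forgetting factor in (0, 1]; 1.0 means no forgetting
    """
    Px = P @ x
    k = Px / (lam / psi + x @ Px)      # Kalman-style gain, scaled by psi
    w = w + k * (y - x @ w)            # correct using the prediction error
    P = (P - np.outer(k, Px)) / lam    # rank-one covariance update
    return w, P

def correntropy_weight(err, sigma=1.0):
    """Gaussian-kernel weight that shrinks toward 0 for large errors,
    down-weighting outliers (the idea behind recursive correntropy)."""
    return np.exp(-err ** 2 / (2.0 * sigma ** 2))

# Toy usage: recover y = 2*x1 - 1*x2 + 0.5 from a low-noise sample stream.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0, 0.5])
w = np.zeros(3)
P = np.eye(3) * 1e3                    # large initial P = weak prior
for _ in range(500):
    x = np.append(rng.normal(size=2), 1.0)   # two inputs plus bias
    y = true_w @ x + rng.normal(scale=0.01)
    e = y - w @ x
    psi = correntropy_weight(e, sigma=1.0)   # robust per-sample weight
    w, P = rwls_update(w, P, x, y, psi=psi)
print(np.round(w, 2))
```

In a full E(N)FS, one such (w, P) pair would be maintained per rule, with psi taken from the rule's membership degree; RWTLS would further replace the plain residual by one that also accounts for noise on x, which this sketch does not attempt.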




Updated: 2020-09-23