Approximation of two-variable functions using high-order Takagi–Sugeno fuzzy systems, sparse regressions, and metaheuristic optimization
Soft Computing (IF 3.1) Pub Date: 2020-09-05, DOI: 10.1007/s00500-020-05238-3
Krzysztof Wiktorowicz , Tomasz Krzeszowski

This paper proposes a new hybrid method for training high-order Takagi–Sugeno fuzzy systems using sparse regressions and metaheuristic optimization. The fuzzy system uses Gaussian fuzzy sets in the antecedents and high-order polynomials in the consequents of the fuzzy rules. The fuzzy sets can be chosen manually or determined by a metaheuristic optimization method (particle swarm optimization, genetic algorithm, or simulated annealing), while the polynomials are obtained using ordinary least squares, ridge regression, or sparse regressions (forward selection, least angle regression, least absolute shrinkage and selection operator, and elastic net regression). A quality criterion is proposed that expresses a compromise between the prediction ability of the fuzzy model and its sparsity. The conducted experiments showed that: (a) the use of sparse regressions and/or metaheuristic optimization can reduce the validation error compared with the reference method, and (b) the use of sparse regressions may simplify the fuzzy model by zeroing some of the coefficients.
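To make the pipeline concrete, the following is a minimal sketch (not the authors' code) of one configuration the abstract describes: a Takagi–Sugeno model with Gaussian antecedent sets and polynomial consequents for a two-variable target function, with the consequent coefficients fitted by ridge regression. The grid of rule centers, the spread `sigma`, the target function, and the regularization value `lam` are all illustrative assumptions.

```python
import numpy as np

# Illustrative two-variable target; the paper approximates functions of this form.
rng = np.random.default_rng(0)
X = rng.uniform(-np.pi, np.pi, size=(200, 2))
y = np.sin(X[:, 0]) * np.cos(X[:, 1])

# Rule antecedents: Gaussian fuzzy sets placed on an assumed 3x3 grid of centers.
centers = np.array([[c1, c2] for c1 in (-2.0, 0.0, 2.0) for c2 in (-2.0, 0.0, 2.0)])
sigma = 1.5
# Firing strengths w_r(x) = exp(-||x - c_r||^2 / (2 sigma^2)), normalized per sample.
W = np.exp(-((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1) / (2 * sigma**2))
W /= W.sum(axis=1, keepdims=True)

# Consequents: second-order polynomials in (x1, x2) for each rule.
P = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0] ** 2, X[:, 0] * X[:, 1], X[:, 1] ** 2])
# Global design matrix: each rule's polynomial terms weighted by its firing strength,
# so all consequent coefficients can be solved in one linear regression.
Phi = np.hstack([W[:, [r]] * P for r in range(len(centers))])

# Ridge regression for the consequent coefficients (lambda is an assumed value;
# the paper also uses OLS and sparse regressions such as LASSO at this step).
lam = 1e-3
theta = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ y)

y_hat = Phi @ theta
rmse = np.sqrt(np.mean((y - y_hat) ** 2))
print(f"training RMSE: {rmse:.3f}")
```

Swapping ridge for a sparse regression at the `theta` step would zero some consequent coefficients, which is how the method simplifies the fuzzy model; metaheuristic optimization would tune `centers` and `sigma` instead of fixing them on a grid.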




Updated: 2020-09-07