Fixed-Point Minimum Error Entropy with Fiducial Points
IEEE Transactions on Signal Processing (IF 4.6). Pub Date: 2020-01-01. DOI: 10.1109/tsp.2020.3001404
Yuqing Xie, Yingsong Li, Yuantao Gu, Jiuwen Cao, Badong Chen

Compared with traditional learning criteria, such as minimum mean square error (MMSE), the minimum error entropy (MEE) criterion has received increasing attention in the domains of nonlinear and non-Gaussian signal processing and machine learning. Since the MEE criterion is shift-invariant, a bias must be added to achieve zero-mean error over the training data. To address this, a modification of MEE called minimization of error entropy with fiducial points (MEEF) was proposed, which controls the bias of MEE in a more elegant and efficient way. In this paper, we propose a fixed-point minimization of error entropy with fiducial points (MEEF-FP) as an alternative to the gradient-based MEEF for training a linear-in-parameters (LIP) model, owing to its fast convergence, robustness, and step-size-free operation. We also provide a sufficient condition that guarantees the convergence of the MEEF-FP algorithm. Moreover, we develop a recursive MEEF-FP (RMEEF-FP) for low-complexity online adaptive learning. Finally, illustrative examples demonstrate the excellent performance of the new methods.
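The abstract does not give the update equations, but the fixed-point idea for an LIP model can be sketched. Under the common formulation, the MEEF cost mixes a correntropy-like term on the errors themselves (the fiducial points at the origin) with the pairwise error-entropy term; setting the gradient to zero yields a weighted-least-squares-style fixed-point update. The sketch below is an illustrative reconstruction under those assumptions, not the paper's exact algorithm: the function name `meef_fp`, the kernel bandwidth `sigma`, the mixing weight `lam`, and the least-squares initialization are all choices made here for demonstration.

```python
import numpy as np

def gauss(u, sigma):
    """Gaussian kernel (unnormalized), applied elementwise."""
    return np.exp(-u**2 / (2.0 * sigma**2))

def meef_fp(X, d, sigma=1.0, lam=0.5, iters=30, w0=None):
    """Schematic fixed-point iteration for an MEEF-style cost on an
    LIP model d ~ X @ w.

    Each iteration re-solves a weighted least-squares problem whose
    weights come from Gaussian kernels on the current errors (the
    fiducial-point term) and on pairwise error differences (the
    error-entropy term)."""
    N, M = X.shape
    # Assumed initialization: ordinary least squares (not specified in the abstract).
    w = np.linalg.lstsq(X, d, rcond=None)[0] if w0 is None else w0.copy()
    for _ in range(iters):
        e = d - X @ w
        # Fiducial-point term: kernel weights on the errors themselves.
        h = gauss(e, sigma)                       # shape (N,)
        # Pairwise error-entropy term: kernel weights on error differences.
        de = e[:, None] - e[None, :]              # (N, N)
        g = gauss(de, sigma)                      # (N, N)
        dX = X[:, None, :] - X[None, :, :]        # (N, N, M)
        dd = d[:, None] - d[None, :]              # (N, N)
        # Normal equations of the kernel-weighted least-squares problem.
        A = lam * (X.T * h) @ X \
            + (1.0 - lam) / N * np.einsum('ij,ijk,ijl->kl', g, dX, dX)
        b = lam * X.T @ (h * d) \
            + (1.0 - lam) / N * np.einsum('ij,ij,ijk->k', g, dd, dX)
        w = np.linalg.solve(A, b)
    return w
```

Because the pairwise term depends only on error differences, a common additive offset in the errors cancels there, which is exactly the shift-invariance the abstract mentions; the fiducial-point term (weighted by `lam`) is what anchors the errors at zero. Impulsive outliers receive near-zero kernel weight and so are effectively downweighted in each re-solve.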

Updated: 2020-01-01