Applying Exponential Family Distribution to Generalized Extreme Learning Machine
IEEE Transactions on Systems, Man, and Cybernetics: Systems ( IF 8.7 ) Pub Date : 2020-05-01 , DOI: 10.1109/tsmc.2017.2788005
Yuheng Jia , Sam Kwong , Ran Wang

The learning algorithm of an extreme learning machine (ELM) has two fundamental steps: 1) random nonlinear feature transformation and 2) least-squares learning. Because the probabilistic interpretation of a sample under the least-squares method follows a Gaussian distribution, the second step imposes two limitations on ELM: 1) it may handle binary classification problems inaccurately, since the output of a binary dataset follows a distribution far from Gaussian, and 2) it may have difficulty with nontraditional data types (such as count data and ordinal data), which likewise do not follow a Gaussian distribution. To address these problems, this paper proposes a generalized ELM (GELM) framework that applies the exponential family distribution (EFD) to the output-layer node of ELM, simplifying the design of ELM models for task-specific output domains with different data types. We propose a unified learning paradigm for all models under the GELM framework with different EFD distributions, and prove that traditional ELM is a special instance of GELM obtained by setting the output distribution to Gaussian (GELM-Gaussian). We also prove that GELM-Gaussian can be trained in a single iteration, so GELM-Gaussian does not slow down the training of traditional ELM. In addition, we propose a kernel version of GELM, which can likewise be instantiated as different models by applying different EFDs. Experimental comparisons demonstrate that GELM gives a more accurate probabilistic interpretation for binary classification and shows great potential for a broader range of machine learning tasks.
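The two-step pipeline described in the abstract can be sketched in code. This is a hypothetical minimal illustration, not the authors' implementation: the names `elm_features`, `elm_solve`, and `gelm_bernoulli` are assumptions, and the Bernoulli variant uses plain gradient descent as one possible iterative solver for a non-Gaussian EFD output, whereas the Gaussian case (traditional ELM) closes in a single least-squares solve.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_features(X, n_hidden=50):
    """Step 1: random nonlinear feature transformation.

    Input weights and biases are drawn randomly and never trained
    (the defining property of ELM); a sigmoid is used as the
    hidden-layer activation.
    """
    d = X.shape[1]
    W = rng.standard_normal((d, n_hidden))   # random input weights (fixed)
    b = rng.standard_normal(n_hidden)        # random biases (fixed)
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

def elm_solve(H, T, reg=1e-3):
    """Step 2: least-squares output weights (the GELM-Gaussian case).

    Solving the regularized normal equations finishes training in a
    single step, matching the one-iteration claim for GELM-Gaussian.
    """
    m = H.shape[1]
    return np.linalg.solve(H.T @ H + reg * np.eye(m), H.T @ T)

def gelm_bernoulli(H, t, reg=1e-3, n_iter=200, lr=0.5):
    """GELM with a Bernoulli (logistic) output distribution.

    For binary labels t in {0, 1}, the output weights maximize a
    regularized Bernoulli likelihood; here a simple gradient-descent
    loop stands in for whatever iterative solver one prefers.
    """
    n, m = H.shape
    beta = np.zeros((m, 1))
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-(H @ beta)))      # Bernoulli mean
        beta -= lr * (H.T @ (p - t) / n + reg * beta)
    return beta

# Toy data: a regression target for the Gaussian case and a
# binary target for the Bernoulli case, sharing one feature map.
X = rng.standard_normal((200, 3))
y = X.sum(axis=1, keepdims=True)
t = (X.sum(axis=1) > 0).astype(float).reshape(-1, 1)

H = elm_features(X)
beta_gauss = elm_solve(H, y)          # one-shot, traditional ELM
beta_bern = gelm_bernoulli(H, t)      # iterative, Bernoulli EFD output
```

Only the output-layer solver changes between the two cases; the random hidden layer is shared, which is the structural point of the GELM framework.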
