Balancing computational complexity and generalization ability: a novel design for ELM
Neurocomputing (IF 5.5), Pub Date: 2020-08-01, DOI: 10.1016/j.neucom.2020.03.046
Edoardo Ragusa, Paolo Gastaldo, Rodolfo Zunino, Erik Cambria

Abstract Learning paradigms that use random basis functions provide effective tools for dealing with large datasets, as they combine efficient training algorithms with remarkable generalization performance. The paper first considers the affinity between the paradigm of learning with similarity functions and the Extreme Learning Machine (ELM) model, and reformulates the mapping scheme of ELMs. A mapping scheme that better balances generalization ability and network size is a key novelty of the proposed approach, and represents a major advantage when targeting implementation on resource-constrained devices. A computationally efficient heuristic supports the training procedure, suitably applying the theory of learning with similarity functions when considerable amounts of data are available. Experimental results on standard datasets confirm the effectiveness of the proposed approach.
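
To make the idea concrete, below is a minimal sketch of an ELM-style model whose hidden layer is a similarity mapping against landmark points drawn from the training set, with output weights obtained by ridge-regularized least squares (the standard ELM closed-form solution). The Gaussian similarity kernel, the random landmark-selection heuristic, and all parameter names are illustrative assumptions, not the paper's exact formulation or training heuristic.

```python
import numpy as np

def similarity_map(X, landmarks, gamma=1.0):
    """Map inputs to similarity scores against a set of landmark points.

    Uses a Gaussian similarity s(x, l) = exp(-gamma * ||x - l||^2); the kernel
    choice is an assumption made for this sketch.
    """
    # Squared Euclidean distances between every sample and every landmark.
    d2 = ((X[:, None, :] - landmarks[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

def train_similarity_elm(X, y, n_landmarks=50, gamma=1.0, reg=1e-2, seed=None):
    """Train an ELM-style model whose hidden layer is a similarity mapping.

    Landmarks are drawn at random from the training set (a simple stand-in for
    the paper's training heuristic); output weights come from ridge-regularized
    least squares, as in standard ELM training.
    """
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=min(n_landmarks, len(X)), replace=False)
    landmarks = X[idx]
    H = similarity_map(X, landmarks, gamma)  # hidden-layer activations
    # Closed-form ridge solution: beta = (H^T H + reg * I)^{-1} H^T y
    beta = np.linalg.solve(H.T @ H + reg * np.eye(H.shape[1]), H.T @ y)
    return landmarks, beta

def predict(X, landmarks, beta, gamma=1.0):
    return similarity_map(X, landmarks, gamma) @ beta

# Toy usage: regression on a noisy sine wave.
X = np.linspace(0, 6, 200).reshape(-1, 1)
y = np.sin(X).ravel() + 0.1 * np.random.default_rng(0).standard_normal(200)
landmarks, beta = train_similarity_elm(X, y, n_landmarks=30, gamma=2.0, seed=0)
print(predict(X[:5], landmarks, beta, gamma=2.0))
```

Here the number of landmarks directly controls the hidden-layer width, which illustrates the trade-off the paper targets: a smaller landmark set shrinks the network (and the cost on a resource-constrained device), while a larger one improves the expressiveness of the similarity mapping.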

Updated: 2020-08-01