Sparse Ensemble Machine Learning to Improve Robustness of Long-Term Decoding in iBMIs
IEEE Transactions on Neural Systems and Rehabilitation Engineering (IF 4.9) Pub Date: 2019-12-27, DOI: 10.1109/tnsre.2019.2962708
Shoeb Shaikh , Rosa So , Tafadzwa Sibindi , Camilo Libedinsky , Arindam Basu

This paper presents a novel sparse-ensemble-based machine learning approach to enhance the robustness of intracortical Brain Machine Interfaces (iBMIs) in the face of non-stationary distributions of input neural data across time. Each classifier in the ensemble is trained on a randomly sampled (with replacement) subset of input channels. These sparse connections ensure that, with high probability, at least a few of the base classifiers are less affected by variations in some of the recording channels. We have tested the generality of this technique on different base classifiers: linear discriminant analysis (LDA), support vector machine (SVM), extreme learning machine (ELM) and multilayer perceptron (MLP). Results show decoding accuracy improvements of up to ≈21%, 13%, 19%, 10% in non-human primate (NHP) A and 7%, 9%, 7%, 9% in NHP B across test days when using the sparse ensemble approach over a single-classifier model for the LDA, SVM, ELM and MLP algorithms respectively. Furthermore, improvements of up to ≈7 (14)%, 8 (15)%, 9 (19)%, 7 (15)% in NHP A and 8 (15)%, 12 (20)%, 15 (23)%, 12 (19)% in NHP B over Random Forest (Long Short-Term Memory) have been obtained by sparse ensemble LDA, SVM, ELM and MLP respectively.
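The following is a minimal Python sketch of the channel-subsampling idea summarized above, not the authors' released implementation: each base classifier is fit on a set of input channels drawn with replacement, and predictions are combined by majority vote. The ensemble size, the number of channels per model, the LDA base learner and the voting rule are illustrative assumptions.

# Minimal sketch of the sparse-ensemble idea (illustrative, not the paper's code).
# Each base classifier sees only a random subset of channels, sampled with
# replacement; class predictions are combined by majority vote.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

class SparseEnsemble:
    def __init__(self, n_estimators=50, channels_per_model=20, seed=0):
        self.n_estimators = n_estimators
        self.channels_per_model = channels_per_model
        self.rng = np.random.default_rng(seed)
        self.models = []        # fitted base classifiers
        self.channel_sets = []  # channel indices used by each classifier

    def fit(self, X, y):
        n_channels = X.shape[1]
        for _ in range(self.n_estimators):
            # Sample input channels with replacement (the sparse connection).
            idx = self.rng.choice(n_channels, self.channels_per_model, replace=True)
            self.models.append(LinearDiscriminantAnalysis().fit(X[:, idx], y))
            self.channel_sets.append(idx)
        return self

    def predict(self, X):
        # Majority vote across all base classifiers.
        votes = np.stack([m.predict(X[:, idx])
                          for m, idx in zip(self.models, self.channel_sets)])
        return np.array([np.bincount(col).argmax() for col in votes.T])

# Toy usage: 200 trials, 96 recording channels, 4 movement classes (synthetic data).
X = np.random.default_rng(1).normal(size=(200, 96))
y = np.random.default_rng(2).integers(0, 4, size=200)
clf = SparseEnsemble().fit(X, y)
print((clf.predict(X) == y).mean())

Because each model depends on only a fraction of the channels, a drift or dropout on a given electrode degrades only the subset of base classifiers connected to it, which is the robustness mechanism the abstract describes.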

Updated: 2020-03-04