Model-Attentive Ensemble Learning for Sequence Modeling
arXiv - CS - Artificial Intelligence, Pub Date: 2021-02-23, DOI: arxiv-2102.11500
Victor D. Bourgin, Ioana Bica, Mihaela van der Schaar

Medical time-series datasets have unique characteristics that make prediction tasks challenging. Most notably, patient trajectories often contain longitudinal variations in their input-output relationships, generally referred to as temporal conditional shift. Designing sequence models capable of adapting to such time-varying distributions remains a prevailing problem. To address this, we present Model-Attentive Ensemble learning for Sequence modeling (MAES). MAES is a mixture of time-series experts which leverages an attention-based gating mechanism to specialize the experts on different sequence dynamics and adaptively weight their predictions. We demonstrate that MAES significantly outperforms popular sequence models on datasets subject to temporal shift.
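The core idea in the abstract, a mixture of experts whose predictions are combined via attention-based gating weights, can be sketched in a few lines of numpy. This is a minimal illustration under stated assumptions, not the paper's implementation: here each expert is a simple linear map and the gating weights come from dot-product attention between the input and per-expert keys, whereas MAES uses trained time-series experts and a learned attention gate.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable softmax: subtract the max before exponentiating.
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

class AttentiveEnsemble:
    """Toy attention-gated mixture of experts (illustrative only).

    Hypothetical simplification of the MAES idea: each expert is a
    linear predictor, and gating weights are dot-product attention
    scores between the input and learned per-expert keys.
    """
    def __init__(self, n_experts, d_in, d_out):
        # Expert parameters: one (d_out x d_in) linear map per expert.
        self.W = rng.normal(size=(n_experts, d_out, d_in)) * 0.1
        # Gating keys: one d_in-dimensional key per expert.
        self.keys = rng.normal(size=(n_experts, d_in))

    def forward(self, x):
        # Attention score: scaled similarity of the input to each expert's key.
        scores = self.keys @ x / np.sqrt(len(x))
        alpha = softmax(scores)                    # gating weights, sum to 1
        preds = np.einsum('eoi,i->eo', self.W, x)  # each expert's prediction
        return alpha @ preds, alpha                # adaptively weighted combination

model = AttentiveEnsemble(n_experts=3, d_in=4, d_out=2)
y, alpha = model.forward(rng.normal(size=4))
```

Because the gating weights depend on the current input, different experts dominate for different sequence dynamics, which is how such an ensemble can adapt to temporal conditional shift.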

Updated: 2021-02-24