Warped softmax regression for time series classification
Knowledge and Information Systems (IF 2.7), Pub Date: 2021-01-02, DOI: 10.1007/s10115-020-01533-5
Brijnesh Jain

Linear models are a mainstay in statistical pattern recognition but do not play a role in time series classification, because they fail to account for temporal variations. To overcome this limitation, we combine linear models with dynamic time warping (dtw). We analyze the resulting warped-linear models theoretically and empirically. The three main theoretical results are (i) the Representation Theorem, (ii) the Matrix Complexity Lemma, and (iii) local Lipschitz continuity of the warped softmax function. The Representation Theorem roughly states that warped-linear models correspond to polytope classifiers in Euclidean spaces. This key result is useful because it simplifies analysis of warped-linear models. For example, it provides a geometric interpretation, points to the label dependency problem, and justifies application of warped-linear models not only on temporal but also on multivariate data. The Representation Theorem together with the Matrix Complexity Lemma reveals that warped-linear models implement a weight trick by weight selection and massive weight sharing. Local Lipschitz continuity of warped softmax functions admits a principled training of warped-linear models by stochastic subgradient methods. Empirical results show that replacing the inner product of linear models with a dtw-score substantially improves its predictive performance. The theoretical and empirical contributions of this article provide a simple and efficient first-trial alternative to nearest-neighbor methods and open up new perspectives for more sophisticated classifiers such as warped deep learning.
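The abstract's central move, replacing the inner product of softmax regression with a dtw-score, can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the dtw-score is the maximum pathwise sum of products w[i]*x[j] over monotone warping paths, and the names dtw_score and warped_softmax as well as the exact recurrence are illustrative assumptions.

    import numpy as np

    def dtw_score(w, x):
        # Assumed dtw-score: maximum pathwise sum of products w[i]*x[j]
        # over all monotone warping paths aligning weight vector w with series x.
        n, m = len(w), len(x)
        D = np.full((n, m), -np.inf)
        D[0, 0] = w[0] * x[0]
        for i in range(n):
            for j in range(m):
                if i == 0 and j == 0:
                    continue
                best = -np.inf
                if i > 0:
                    best = max(best, D[i - 1, j])
                if j > 0:
                    best = max(best, D[i, j - 1])
                if i > 0 and j > 0:
                    best = max(best, D[i - 1, j - 1])
                D[i, j] = best + w[i] * x[j]
        return D[-1, -1]

    def warped_softmax(W, x):
        # Class probabilities from dtw-scores instead of inner products.
        scores = np.array([dtw_score(w_k, x) for w_k in W])
        z = np.exp(scores - scores.max())  # numerically stable softmax
        return z / z.sum()

    # Toy usage: 3 classes, weight vectors of length 5, a series of length 8.
    rng = np.random.default_rng(0)
    W = rng.normal(size=(3, 5))
    x = np.sin(np.linspace(0, np.pi, 8))
    print(warped_softmax(W, x))

A full warped softmax regression would additionally learn the weight matrix W, for example by stochastic subgradient descent on a cross-entropy loss, which the local Lipschitz continuity result is stated to justify.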




Last updated: 2021-01-02