Sparse Identification and Estimation of Large-Scale Vector AutoRegressive Moving Averages
Journal of the American Statistical Association (IF 3.7), Pub Date: 2021-08-09, DOI: 10.1080/01621459.2021.1942013
Ines Wilms, Sumanta Basu, Jacob Bien, David S. Matteson

Abstract

The vector autoregressive moving average (VARMA) model is fundamental to the theory of multivariate time series; however, identifiability issues have led practitioners to abandon it in favor of the simpler but more restrictive vector autoregressive (VAR) model. We narrow this gap with a new optimization-based approach to VARMA identification built upon the principle of parsimony. Among all equivalent data-generating models, we use convex optimization to seek the parameterization that is simplest in a certain sense. A user-specified strongly convex penalty is used to measure model simplicity, and that same penalty is then used to define an estimator that can be efficiently computed. We establish consistency of our estimators in a double-asymptotic regime. Our nonasymptotic error bound analysis accommodates both model specification and parameter estimation steps, a feature that is crucial for studying large-scale VARMA algorithms. Our analysis also provides new results on penalized estimation of infinite-order VAR, and elastic net regression under a singular covariance structure of regressors, which may be of independent interest. We illustrate the advantage of our method over VAR alternatives on three real data examples.
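The two-stage idea described in the abstract — approximate the VARMA process by a long penalized VAR to recover innovation proxies, then regress on lagged observations and lagged residual proxies to obtain the AR and MA coefficient matrices — can be illustrated with a minimal sketch. This is not the authors' algorithm: a plain ridge penalty stands in for the user-specified strongly convex penalty, the lag order and penalty level are chosen arbitrarily, and all names (`A_hat`, `B_hat`, etc.) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a small VARMA(1,1) process: y_t = A y_{t-1} + e_t + B e_{t-1}
k, T = 3, 500
A = np.diag([0.5, 0.4, 0.3])
B = np.full((k, k), 0.1) + 0.2 * np.eye(k)
e = rng.normal(size=(T, k))
y = np.zeros((T, k))
for t in range(1, T):
    y[t] = A @ y[t - 1] + e[t] + B @ e[t - 1]

# Stage 1: approximate the VARMA by a long penalized VAR(p); its residuals
# serve as proxies for the unobserved innovations e_t.
p, lam = 5, 1.0
X = np.hstack([y[p - j - 1:T - j - 1] for j in range(p)])  # stacked lags
Y = y[p:]
Phi = np.linalg.solve(X.T @ X + lam * np.eye(k * p), X.T @ Y)  # ridge, closed form
resid = Y - X @ Phi

# Stage 2: regress y_t on its own lag and the lagged residual proxy to
# recover the VARMA(1,1) coefficient matrices.
Z = np.hstack([y[p:T - 1], resid[:-1]])
Y2 = y[p + 1:]
coef = np.linalg.solve(Z.T @ Z + lam * np.eye(2 * k), Z.T @ Y2)
A_hat, B_hat = coef[:k].T, coef[k:].T
```

In the paper's setting the ridge term would be replaced by a sparsity-inducing strongly convex penalty (e.g. an elastic-net-type penalty, which the abstract's theory covers even under singular regressor covariance), so that the same penalty drives both the identification and the estimation step.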



