Local dimension reduction of summary statistics for likelihood-free inference
Statistics and Computing (IF 2.2), Pub Date: 2019-10-04, DOI: 10.1007/s11222-019-09905-w
Jukka Sirén, Samuel Kaski

Approximate Bayesian computation (ABC) and other likelihood-free inference methods have gained popularity in the last decade, as they allow rigorous statistical inference for complex models without analytically tractable likelihood functions. A key component for accurate inference with ABC is the choice of summary statistics, which summarize the information in the data, but at the same time should be low-dimensional for efficiency. Several dimension reduction techniques have been introduced to automatically construct informative and low-dimensional summaries from a possibly large pool of candidate summaries. Projection-based methods, which are based on learning simple functional relationships from the summaries to parameters, are widely used and usually perform well, but might fail when the assumptions behind the transformation are not satisfied. We introduce a localization strategy for any projection-based dimension reduction method, in which the transformation is estimated in the neighborhood of the observed data instead of the whole space. Localization strategies have been suggested before, but the performance of the transformed summaries outside the local neighborhood has not been guaranteed. In our localization approach the transformation is validated and optimized over validation datasets, ensuring reliable performance. We demonstrate the improvement in the estimation accuracy for localized versions of linear regression and partial least squares, for three different models of varying complexity.
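
The abstract describes the approach only at a high level. The sketch below is a hedged illustration of the general idea, not the authors' implementation: fit a linear regression from candidate summaries to parameters using only simulations whose summaries lie near the observed data, choose the neighbourhood size by performance on held-out validation simulations (a simple stand-in for the validation and optimization step mentioned above), and use the fitted predictions as low-dimensional summaries in a rejection-ABC step. The simulator, prior, dimensions, and candidate neighbourhood fractions are all hypothetical placeholders.

# Minimal sketch of a localized projection-based summary construction for ABC.
# Simulator, prior, and neighbourhood fractions below are hypothetical.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

def prior_sample(n):
    """Hypothetical 2-parameter uniform prior."""
    return rng.uniform(-2.0, 2.0, size=(n, 2))

def simulator(theta):
    """Hypothetical simulator returning a 20-dimensional vector of candidate summaries."""
    noise = rng.normal(scale=0.5, size=20)
    base = np.concatenate([np.repeat(theta, 5), np.zeros(10)])
    return base + noise

# Training and validation simulations drawn from the prior.
n_train, n_val = 5000, 1000
theta_train = prior_sample(n_train)
s_train = np.array([simulator(t) for t in theta_train])
theta_val = prior_sample(n_val)
s_val = np.array([simulator(t) for t in theta_val])

s_obs = simulator(np.array([0.7, -0.3]))   # "observed" summaries (synthetic here)

def local_projection(s_obs, s_train, theta_train, frac):
    """Fit theta ~ s only on the fraction `frac` of training simulations whose
    summaries are closest to s_obs; its predictions act as low-dimensional summaries."""
    d = np.linalg.norm(s_train - s_obs, axis=1)
    keep = np.argsort(d)[: int(frac * len(d))]
    return LinearRegression().fit(s_train[keep], theta_train[keep])

# Choose the neighbourhood size by prediction error on held-out simulations.
candidate_fracs = [0.05, 0.1, 0.2, 0.5, 1.0]
errors = []
for frac in candidate_fracs:
    model = local_projection(s_obs, s_train, theta_train, frac)
    errors.append(np.mean((model.predict(s_val) - theta_val) ** 2))
best_frac = candidate_fracs[int(np.argmin(errors))]
model = local_projection(s_obs, s_train, theta_train, best_frac)

# Plain rejection ABC using the projected summaries.
proj_obs = model.predict(s_obs[None, :])[0]
theta_abc = prior_sample(20000)
proj_sim = model.predict(np.array([simulator(t) for t in theta_abc]))
dist = np.linalg.norm(proj_sim - proj_obs, axis=1)
accepted = theta_abc[np.argsort(dist)[:200]]   # keep the 200 closest draws
print("Posterior mean estimate:", accepted.mean(axis=0))

The same localization wrapper applies to other projection methods (for example partial least squares, as in the paper's experiments) by swapping out the regression model fitted on the local subset.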

Updated: 2019-10-04