Lower Bounds on the Total Variation Distance Between Mixtures of Two Gaussians
arXiv - CS - Computational Complexity | Pub Date: 2021-09-02 | DOI: arxiv-2109.01064
Sami Davies, Arya Mazumdar, Soumyabrata Pal, Cyrus Rashtchian

Mixtures of high dimensional Gaussian distributions have been studied extensively in statistics and learning theory. While the total variation distance appears naturally in the sample complexity of distribution learning, it is analytically difficult to obtain tight lower bounds for mixtures. Exploiting a connection between total variation distance and the characteristic function of the mixture, we provide fairly tight functional approximations. This enables us to derive new lower bounds on the total variation distance between pairs of two-component Gaussian mixtures that have a shared covariance matrix.
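As a hedged illustration of the kind of connection the abstract refers to (these are standard facts about characteristic functions, not necessarily the exact bounds derived in the paper), consider distributions P and Q on R^d with characteristic functions phi_P and phi_Q, and assume for simplicity an equal-weight mixture:

\[
\phi_P(t) = \mathbb{E}_{X \sim P}\big[e^{i\langle t, X\rangle}\big],
\qquad
\big|\phi_P(t) - \phi_Q(t)\big| \;\le\; 2\, d_{\mathrm{TV}}(P, Q)
\quad \text{for every } t \in \mathbb{R}^d,
\]

since \(e^{i\langle t, x\rangle}\) is bounded by 1 in modulus. Any point \(t\) at which the two characteristic functions are separated therefore yields a lower bound on \(d_{\mathrm{TV}}(P,Q) = \tfrac{1}{2}\int |p(x) - q(x)|\,dx\). For an equal-weight two-component mixture with shared covariance \(\Sigma\), the characteristic function has the closed form

\[
\phi(t) \;=\; e^{-\frac{1}{2} t^{\top} \Sigma t}\cdot \tfrac{1}{2}\Big(e^{i\langle t, \mu_1\rangle} + e^{i\langle t, \mu_2\rangle}\Big),
\]

which is what makes the characteristic-function route analytically tractable compared with working with the mixture density directly.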

Updated: 2021-09-03