Sliced Kernelized Stein Discrepancy
arXiv - CS - Machine Learning. Pub Date: 2020-06-30, DOI: arxiv-2006.16531
Wenbo Gong, Yingzhen Li, José Miguel Hernández-Lobato

Kernelized Stein discrepancy (KSD), though extensively used in goodness-of-fit tests and model learning, suffers from the curse of dimensionality. We address this issue by proposing the sliced Stein discrepancy and its scalable and kernelized variants, which employ kernel-based test functions defined on optimal one-dimensional projections rather than on the full high-dimensional input. When applied to goodness-of-fit tests, extensive experiments show that the proposed discrepancy significantly outperforms KSD and various baselines in high dimensions. For model learning, we demonstrate its advantages over existing Stein discrepancy baselines by training an independent component analysis model. We further propose a novel particle inference method called sliced Stein variational gradient descent (S-SVGD), which alleviates the mode-collapse issue of SVGD when training variational autoencoders.
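For context, below is a minimal NumPy sketch of the standard KSD U-statistic with an RBF kernel, i.e. the quantity the abstract says degrades in high dimensions. The function name ksd_u_statistic, the bandwidth parameter, and the score callback are illustrative assumptions rather than anything from the paper; the paper's sliced variants instead apply kernel-based test functions to optimal one-dimensional projections of the input, which this sketch does not reproduce.

    import numpy as np

    def ksd_u_statistic(x, score, bandwidth=1.0):
        """U-statistic estimate of KSD^2 with an RBF kernel.

        x      : (n, d) samples from the approximating distribution q
        score  : callable returning grad_x log p(x) for the target p, shape (n, d)
        """
        n, d = x.shape
        s = score(x)                                  # (n, d) target score at the samples
        diff = x[:, None, :] - x[None, :, :]          # (n, n, d) pairwise differences
        sq = np.sum(diff ** 2, axis=-1)               # (n, n) squared distances
        h2 = bandwidth ** 2
        k = np.exp(-sq / (2 * h2))                    # RBF kernel k(x, x')
        grad_kx = -diff / h2 * k[..., None]           # gradient of k w.r.t. x
        grad_ky = diff / h2 * k[..., None]            # gradient of k w.r.t. x'
        trace_term = k * (d / h2 - sq / h2 ** 2)      # tr(d^2 k / dx dx')
        term1 = (s @ s.T) * k                         # score(x)^T score(x') k(x, x')
        term2 = np.einsum('id,ijd->ij', s, grad_ky)   # score(x)^T grad_{x'} k
        term3 = np.einsum('jd,ijd->ij', s, grad_kx)   # score(x')^T grad_x k
        u = term1 + term2 + term3 + trace_term        # Stein kernel u_p(x, x')
        np.fill_diagonal(u, 0.0)                      # drop diagonal for an unbiased U-statistic
        return u.sum() / (n * (n - 1))

As a sanity check, for a standard Gaussian target the score is simply score(x) = -x, and the statistic evaluated on samples drawn from that same Gaussian should be close to zero.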

Updated: 2020-07-01