Robust Generalization despite Distribution Shift via Minimum Discriminating Information
arXiv - CS - Information Theory. Pub Date: 2021-06-08. arXiv: 2106.04443
Tobias Sutter, Andreas Krause, Daniel Kuhn

Training models that perform well under distribution shifts is a central challenge in machine learning. In this paper, we introduce a modeling framework where, in addition to training data, we have partial structural knowledge of the shifted test distribution. We employ the principle of minimum discriminating information to embed the available prior knowledge, and use distributionally robust optimization to account for uncertainty due to the limited samples. By leveraging large deviation results, we obtain explicit generalization bounds with respect to the unknown shifted distribution. Lastly, we demonstrate the versatility of our framework by applying it to two rather distinct applications: (1) training classifiers on systematically biased data and (2) off-policy evaluation in Markov decision processes.
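For readers unfamiliar with the minimum discriminating information (MDI) principle the abstract relies on: among all distributions consistent with partial knowledge of the shift (for example, a known moment of the test distribution), MDI selects the one closest in Kullback-Leibler divergence to the nominal training distribution, and the optimizer is an exponential tilt of the nominal model. The sketch below illustrates this on a toy finite sample space; the nominal distribution p0, the feature f, and the target moment c are illustrative assumptions, not values from the paper.

import numpy as np
from scipy.optimize import minimize_scalar

# Toy setup (illustrative values, not from the paper): a nominal training
# distribution p0 on four states, a scalar feature f, and partial structural
# knowledge that the shifted test distribution satisfies E[f(X)] = c.
p0 = np.array([0.4, 0.3, 0.2, 0.1])
f = np.array([0.0, 1.0, 2.0, 3.0])
c = 1.5

# MDI dual: the tilt parameter lam minimizes the convex function
#   log E_{p0}[exp(lam * f(X))] - lam * c.
def dual(lam):
    return np.log(np.sum(p0 * np.exp(lam * f))) - lam * c

lam = minimize_scalar(dual).x

# Primal solution: an exponential tilt of p0, renormalized. This is the
# distribution minimizing KL(p || p0) subject to E_p[f(X)] = c.
p = p0 * np.exp(lam * f)
p /= p.sum()

print("MDI distribution:", p)
print("moment E_p[f] =", p @ f, "(target c =", c, ")")

The one-dimensional convex dual above is the simplest instance of the exponential-family structure that MDI induces; the paper additionally robustifies around such a model with distributionally robust optimization to hedge against finite-sample error.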

Updated: 2021-06-09