Domain Invariant and Agnostic Adaptation
Knowledge-Based Systems ( IF 8.8 ) Pub Date : 2021-06-11 , DOI: 10.1016/j.knosys.2021.107192
Sentao Chen , Hanrui Wu , Cheng Liu

Domain adaptation addresses the prediction problem in which the source and target data are sampled from different but related probability distributions. The key challenge lies in properly matching the distributions and learning a general feature representation for training the prediction model. In this article, we introduce a Domain Invariant and Agnostic Adaptation (DIAA) solution, which matches the source and target joint distributions, and simultaneously aligns the joint distribution of features and domain labels with the product of its marginals. In particular, DIAA matches and aligns the distributions via a feature transformation, and compares the two kinds of distribution disparities uniformly under the Kullback–Leibler (KL) divergence. To approximate the two corresponding KL divergences from observed samples, we derive a linear-regression-like technique that fits linear models to different ratio functions under the quadratic loss. With the estimated KL divergences, learning the DIAA feature transformation is formulated as solving a Grassmannian minimization problem. Experiments on text and image classification tasks of a varied nature demonstrate the success of our approach.
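The "linear-regression-like" KL approximation described above can be illustrated with a minimal sketch. This is not the authors' exact formulation: the Gaussian basis features, the ridge term `lam`, and the toy 1-D Gaussian data are illustrative assumptions. The core idea shown is least-squares fitting of a linear ratio model under the quadratic loss, followed by a plug-in KL estimate:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy samples from two 1-D Gaussians standing in for source/target features.
xp = rng.normal(0.0, 1.0, size=(500, 1))   # samples from p
xq = rng.normal(0.5, 1.0, size=(500, 1))   # samples from q

def features(x, centers, width=1.0):
    # Gaussian basis functions; the ratio model is linear in these features.
    d2 = (x - centers.T) ** 2
    return np.exp(-d2 / (2 * width ** 2))

centers = xq[:20]                          # basis centers taken from q-samples
Phi_p = features(xp, centers)
Phi_q = features(xq, centers)

# Quadratic-loss fit of the linear ratio model r(x) = w^T phi(x):
# minimize (1/2) w^T H w - h^T w + (lam/2) ||w||^2,
# where H = E_q[phi phi^T] and h = E_p[phi].
H = Phi_q.T @ Phi_q / len(xq)
h = Phi_p.mean(axis=0)
lam = 1e-3
w = np.linalg.solve(H + lam * np.eye(len(centers)), h)

# Plug-in KL estimate: KL(p||q) is approximated by the mean of
# log r(x) over the p-samples.
r_p = np.clip(Phi_p @ w, 1e-8, None)       # clip to keep the log finite
kl_hat = np.log(r_p).mean()
print(f"estimated KL(p||q): {kl_hat:.3f}")
```

For these two unit-variance Gaussians the true KL divergence is 0.125, so a reasonable estimator should return a small positive value; in DIAA such estimates would then drive the Grassmannian optimization of the feature transformation.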




Updated: 2021-06-11