Contrastive Learning Meets Transfer Learning: A Case Study In Medical Image Analysis
arXiv - CS - Computer Vision and Pattern Recognition. Pub Date: 2021-03-04, DOI: arxiv-2103.03166
Yuzhe Lu, Aadarsh Jha, Yuankai Huo

Annotated medical images are typically rarer than labeled natural images, since annotation is limited by domain knowledge and privacy constraints. Recent advances in transfer and contrastive learning have provided effective solutions to this issue from different perspectives. However, state-of-the-art transfer learning (e.g., Big Transfer (BiT)) and contrastive learning (e.g., Simple Siamese Contrastive Learning (SimSiam)) approaches have been investigated independently, without considering their complementary nature. It would be appealing to accelerate contrastive learning with transfer learning, given that slow convergence is a critical limitation of modern contrastive learning approaches. In this paper, we investigate the feasibility of aligning BiT with SimSiam. Our empirical analyses show that the difference in normalization techniques (Group Norm in BiT vs. Batch Norm in SimSiam) is the key hurdle in adapting BiT to SimSiam. We evaluated BiT, SimSiam, and the combined BiT+SimSiam on the CIFAR-10 and HAM10000 datasets. The results suggest that BiT models accelerate the convergence of SimSiam, and that the combined model outperforms both of its counterparts. We hope this study motivates researchers to revisit the task of aggregating big pre-trained models with contrastive learning models for image analysis.
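To make the pairing concrete, below is a minimal PyTorch sketch of the idea, not the authors' released code: a SimSiam model whose encoder is a ResNet-50 built with GroupNorm, the BiT-style backbone into which pre-trained BiT weights could be loaded. The MLP sizes follow the original SimSiam recipe; the `group_norm` helper and the weight-loading step are illustrative assumptions.

```python
# Sketch of BiT + SimSiam: a GroupNorm (BiT-style) encoder inside a SimSiam head.
import torch
import torch.nn as nn
import torch.nn.functional as F
import torchvision


def group_norm(num_channels):
    # BiT replaces BatchNorm with GroupNorm (32 groups) in the backbone.
    return nn.GroupNorm(32, num_channels)


class SimSiam(nn.Module):
    def __init__(self, dim=2048, pred_dim=512):
        super().__init__()
        # GroupNorm ResNet-50 encoder; in practice, BiT pre-trained weights
        # would be loaded into this backbone before contrastive training.
        self.encoder = torchvision.models.resnet50(norm_layer=group_norm)
        self.encoder.fc = nn.Identity()
        # 3-layer projection MLP and 2-layer prediction MLP, as in SimSiam.
        self.projector = nn.Sequential(
            nn.Linear(2048, dim), nn.BatchNorm1d(dim), nn.ReLU(inplace=True),
            nn.Linear(dim, dim), nn.BatchNorm1d(dim), nn.ReLU(inplace=True),
            nn.Linear(dim, dim), nn.BatchNorm1d(dim))
        self.predictor = nn.Sequential(
            nn.Linear(dim, pred_dim), nn.BatchNorm1d(pred_dim),
            nn.ReLU(inplace=True), nn.Linear(pred_dim, dim))

    def forward(self, x1, x2):
        # x1, x2 are two augmented views of the same image batch.
        z1 = self.projector(self.encoder(x1))
        z2 = self.projector(self.encoder(x2))
        p1, p2 = self.predictor(z1), self.predictor(z2)
        # Symmetrized negative cosine similarity with stop-gradient on z.
        loss = -(F.cosine_similarity(p1, z2.detach()).mean()
                 + F.cosine_similarity(p2, z1.detach()).mean()) / 2
        return loss
```

The stop-gradient (`z.detach()`) is what prevents representational collapse in SimSiam; the sketch keeps BatchNorm inside the MLP heads and GroupNorm only in the encoder, reflecting the normalization mismatch the paper identifies as the key hurdle.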

Updated: 2021-03-05