SSL4EO-S12: A large-scale multimodal, multitemporal dataset for self-supervised learning in Earth observation [Software and Data Sets]
IEEE Geoscience and Remote Sensing Magazine (IF 14.6) Pub Date: 2023-09-25, DOI: 10.1109/mgrs.2023.3281651
Yi Wang, Nassim Ait Ali Braham, Zhitong Xiong, Chenying Liu, Conrad M. Albrecht, Xiao Xiang Zhu

Self-supervised pretraining bears the potential to generate expressive representations from large-scale Earth observation (EO) data without human annotation. However, most existing pretraining in the field is based on ImageNet or medium-sized, labeled remote sensing (RS) datasets. In this article, we share an unlabeled dataset, Self-Supervised Learning for Earth Observation-Sentinel-1/2 (SSL4EO-S12), which assembles a large-scale, global, multimodal, and multiseasonal corpus of satellite imagery. We demonstrate that SSL4EO-S12 succeeds in self-supervised pretraining for a set of representative methods: momentum contrast (MoCo), self-distillation with no labels (DINO), masked autoencoders (MAE), and data2vec, and for multiple downstream applications, including scene classification, semantic segmentation, and change detection. Our benchmark results demonstrate the effectiveness of SSL4EO-S12 compared to existing datasets. The dataset, related source code, and pretrained models are available at https://github.com/zhu-xlab/SSL4EO-S12.
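The abstract lists momentum contrast (MoCo) among the pretraining methods benchmarked on SSL4EO-S12. As background, the following is a minimal NumPy sketch of the InfoNCE contrastive loss at MoCo's core; the synthetic embeddings stand in for encoded Sentinel-1/2 patch features, and all names and values here are illustrative assumptions, not taken from the SSL4EO-S12 codebase:

```python
import numpy as np

def info_nce_loss(query, positive, negatives, temperature=0.07):
    """InfoNCE loss for a single query embedding.

    query:     (d,) embedding of one augmented view of a patch
    positive:  (d,) embedding of another view of the same patch
    negatives: (n, d) embeddings of other patches (MoCo's queue)
    """
    # Cosine similarities via L2-normalized dot products
    q = query / np.linalg.norm(query)
    k_pos = positive / np.linalg.norm(positive)
    k_neg = negatives / np.linalg.norm(negatives, axis=1, keepdims=True)

    # Positive logit first, then the n negative logits
    logits = np.concatenate([[q @ k_pos], k_neg @ q]) / temperature

    # Cross-entropy with the positive at index 0
    logits -= logits.max()  # numerical stability
    log_softmax = logits - np.log(np.exp(logits).sum())
    return -log_softmax[0]

rng = np.random.default_rng(0)
d, n = 128, 16
anchor = rng.normal(size=d)
# Two "views" of the same patch: small perturbations of one anchor.
# In SSL4EO-S12-style pretraining, views of different seasons or
# modalities of the same location can play this role.
q = anchor + 0.01 * rng.normal(size=d)
k = anchor + 0.01 * rng.normal(size=d)
neg = rng.normal(size=(n, d))
loss = info_nce_loss(q, k, neg)
print(f"InfoNCE loss: {loss:.4f}")  # small when the positive pair is well aligned
```

The loss is small when the two views of the same patch are close in embedding space relative to the negatives, which is the training signal contrastive pretraining optimizes.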

Updated: 2023-09-25