CT-UNet: Context-Transfer-UNet for Building Segmentation in Remote Sensing Images
Neural Processing Letters (IF 2.6) Pub Date: 2021-08-02, DOI: 10.1007/s11063-021-10592-w
Sheng Liu, Huanran Ye, Kun Jin, Haohao Cheng

With the proliferation of remote sensing imagery, segmenting buildings accurately in remote sensing images has become a critical challenge. First, most networks recognize high-resolution images poorly, producing blurred boundaries in the segmented building maps. Second, the similarity between buildings and background leads to intra-class inconsistency. To address these two problems, we propose a UNet-based network named Context-Transfer-UNet (CT-UNet). Specifically, we design a Dense Boundary Block: the Dense Block exploits a feature-reuse mechanism to refine features and improve recognition capability, while the Boundary Block introduces low-level spatial information to resolve the fuzzy-boundary problem. Then, to handle intra-class inconsistency, we construct a Spatial Channel Attention Block, which combines contextual spatial information and selects more discriminative features along both the spatial and channel dimensions. Finally, we propose an improved loss function that sharpens the training objective by incorporating an evaluation metric into the loss. With the proposed CT-UNet, we achieve 85.33% mean IoU on the Inria dataset, 91.00% mean IoU on the WHU dataset and an 83.92% F1-score on the Massachusetts dataset. These results outperform our baseline (U-Net ResNet-34) by 3.76%, exceed Web-Net by 2.24% and surpass HFSA-Unet by 2.17%.
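The abstract states that the improved loss incorporates an evaluation metric, but does not give its exact form. A common way to realize this idea for binary building segmentation is to combine pixel-wise binary cross-entropy with a differentiable soft-IoU term, so that the loss directly reflects the IoU evaluation criterion. The sketch below illustrates that pattern in NumPy; the function names and the weighting factor `alpha` are our own illustrative choices, not taken from the paper.

```python
import numpy as np

def soft_iou_loss(pred, target, eps=1e-6):
    """Soft-IoU loss: 1 - intersection/union over predicted probabilities.

    pred   -- predicted foreground probabilities in [0, 1]
    target -- binary ground-truth mask, same shape as pred
    """
    inter = np.sum(pred * target)
    union = np.sum(pred) + np.sum(target) - inter
    return 1.0 - (inter + eps) / (union + eps)

def bce_loss(pred, target, eps=1e-6):
    """Binary cross-entropy averaged over all pixels."""
    p = np.clip(pred, eps, 1.0 - eps)  # avoid log(0)
    return -np.mean(target * np.log(p) + (1 - target) * np.log(1 - p))

def combined_loss(pred, target, alpha=0.5):
    """Weighted sum of BCE and soft-IoU terms (alpha is a hypothetical weight)."""
    return alpha * bce_loss(pred, target) + (1 - alpha) * soft_iou_loss(pred, target)
```

A perfect prediction drives both terms toward zero, while the soft-IoU term penalizes boundary errors more strongly than plain BCE on small foreground regions, which is the usual motivation for such a combination.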




Updated: 2021-08-02