Batching Soft IoU for Training Semantic Segmentation Networks
IEEE Signal Processing Letters (IF 3.2), Pub Date: 2020-01-01, DOI: 10.1109/lsp.2019.2956367
Yifeng Huang, Zhirong Tang, Dan Chen, Kaixiong Su, Chengbin Chen

Most semantic segmentation networks employ cross-entropy as the loss function but intersection-over-union (IoU) as the evaluation metric; using IoU directly as the loss function resolves this mismatch between training objective and evaluation metric. We propose a mini-batch-based Soft IoU training strategy (mini-batch Soft IoU). Our work makes two primary contributions. The first is to extend the IoU loss function to multi-class segmentation networks. The second is to assemble training samples from diverse categories in every mini-batch, which guarantees that each mini-batch covers at least as many categories as its batch size. Our method replaces the purely random sampling of the original mini-batch gradient descent (GD) strategy, making the samples in each mini-batch more consistent with the distribution of the overall data and thereby mitigating the instability of the IoU loss function. Experimental results on the PASCAL VOC2012 dataset show that our method effectively improves segmentation accuracy and achieves significant gains over state-of-the-art IoU loss function methods.
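The letter itself does not include code; the following is a minimal NumPy sketch of the two ideas the abstract describes, namely a multi-class soft IoU loss and a class-aware mini-batch sampler. The function names, tensor shapes, and sampling details are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def soft_iou_loss(probs, targets, eps=1e-7):
    """Multi-class soft IoU loss (sketch, not the paper's exact formulation).

    probs:   (N, C, H, W) softmax class probabilities
    targets: (N, H, W) integer class labels
    """
    c = probs.shape[1]
    # One-hot encode targets: (N, H, W) -> (N, C, H, W)
    one_hot = np.eye(c)[targets].transpose(0, 3, 1, 2)
    # Soft intersection/union per class, summed over batch and pixels
    inter = (probs * one_hot).sum(axis=(0, 2, 3))
    union = (probs + one_hot - probs * one_hot).sum(axis=(0, 2, 3))
    iou = (inter + eps) / (union + eps)
    # Loss is 1 minus the mean per-class soft IoU
    return 1.0 - iou.mean()

def balanced_batch(samples_by_class, batch_size, rng):
    """Sketch of class-aware batching: draw each sample in the batch from a
    different class, so a batch of size B covers at least B categories
    (assuming each sample actually contains its assigned class).

    samples_by_class: dict mapping class id -> list of sample indices
    """
    classes = rng.choice(list(samples_by_class), size=batch_size, replace=False)
    return [int(rng.choice(samples_by_class[c])) for c in classes]
```

A perfect prediction (probabilities equal to the one-hot targets) drives the loss to zero, while the balanced sampler guarantees the per-batch class coverage that the abstract argues stabilizes IoU-based training.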
