Densely Connected Neural Network With Unbalanced Discriminant and Category Sensitive Constraints for Polyp Recognition
IEEE Transactions on Automation Science and Engineering (IF 5.9) Pub Date: 2019-09-17, DOI: 10.1109/tase.2019.2936645
Yixuan Yuan , Wenjian Qin , Bulat Ibragimov , Guanglei Zhang , Bin Han , Max Q.-H. Meng , Lei Xing

Automatic polyp recognition in endoscopic images is challenging because of the low contrast between polyps and the surrounding area, the fuzzy and irregular polyp borders, and varying imaging light conditions. In this article, we propose a novel densely connected convolutional network with an "unbalanced discriminant (UD)" loss and a "category sensitive (CS)" loss (DenseNet-UDCS) for the task. We first employ a densely connected convolutional network (DenseNet) as the basic framework to perform end-to-end polyp recognition. Then, the proposed dual constraints, the UD loss and the CS loss, are simultaneously incorporated into the DenseNet model to learn discriminative and suitable image features. The UD loss effectively captures classification errors from both the majority and minority categories to cope with the strong data imbalance between polyp images and normal ones. The CS loss constrains the ratio of intraclass to interclass variation during deep feature learning, yielding features with large interclass separation and compact intraclass distribution. With the joint supervision of the UD loss and the CS loss, a robust DenseNet-UDCS model is trained to recognize polyps in endoscopic images. The proposed DenseNet-UDCS achieved a polyp recognition accuracy of 93.19% in our experiments, showing that it can accurately characterize endoscopic images and recognize polyps in them. In addition, our DenseNet-UDCS model surpasses state-of-the-art polyp recognition methods in detection accuracy.

Note to Practitioners: Wireless capsule endoscopy (WCE) is a crucial diagnostic tool for polyp detection and therapeutic monitoring, thanks to its noninvasive, user-friendly, and painless properties. A challenge in harnessing the enormous potential of WCE to benefit gastrointestinal (GI) patients is that it requires clinicians to analyze a huge number of images (about 50,000 per patient).
We propose a novel automatic polyp recognition scheme, the DenseNet-UDCS model, that addresses the practical class-imbalance problem as well as the small interclass variances and large intraclass differences in the data set. Comprehensive experimental results demonstrate the superior reliability and robustness of the proposed model compared with other polyp recognition approaches. Our DenseNet-UDCS model can further be applied in clinical practice to provide valuable diagnostic information for GI disease recognition and precision medicine.
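The two constraints described in the abstract can be sketched in code. The following is a hypothetical NumPy illustration, not the paper's exact formulation: `ud_loss` here is a simple class-weighted cross-entropy standing in for the unbalanced discriminant loss (up-weighting errors on the minority polyp class), and `cs_loss` computes an intraclass-to-interclass scatter ratio in the spirit of the category sensitive constraint. All function names and the `w_minority` parameter are assumptions for illustration.

```python
import numpy as np

def ud_loss(probs, labels, w_minority=2.0):
    # Hypothetical sketch of the UD idea: weighted cross-entropy that
    # up-weights errors on the minority class (label 1 = polyp).
    # The paper's exact UD loss formulation differs.
    weights = np.where(labels == 1, w_minority, 1.0)
    ce = -np.log(probs[np.arange(len(labels)), labels] + 1e-12)
    return float(np.mean(weights * ce))

def cs_loss(features, labels, eps=1e-12):
    # Hypothetical sketch of the CS idea: ratio of intraclass scatter to
    # interclass scatter, so minimizing it encourages compact classes
    # that are far apart in feature space.
    mu = features.mean(axis=0)          # global feature mean
    intra, inter = 0.0, 0.0
    for c in np.unique(labels):
        fc = features[labels == c]
        mc = fc.mean(axis=0)            # class mean
        intra += np.sum((fc - mc) ** 2)
        inter += len(fc) * np.sum((mc - mu) ** 2)
    return float(intra / (inter + eps))
```

In this sketch, the two terms would be added (with weighting factors) to the standard classification loss during DenseNet training; the CS ratio decreases as features become more separable, while the UD weighting keeps the scarce polyp class from being overwhelmed by the majority class.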

Updated: 2024-08-22