Automatic detection and classification of protruding lesions in wireless capsule endoscopy images based on a deep convolutional neural network.
Gastrointestinal Endoscopy (IF 7.7), Pub Date: 2020-02-19, DOI: 10.1016/j.gie.2020.01.054
Hiroaki Saito, Tomonori Aoki, Kazuharu Aoyama, Yusuke Kato, Akiyoshi Tsuboi, Atsuo Yamada, Mitsuhiro Fujishiro, Shiro Oka, Soichiro Ishihara, Tomoki Matsuda, Masato Nakahori, Shinji Tanaka, Kazuhiko Koike, Tomohiro Tada

Background and Aims

Protruding lesions of the small bowel vary widely in appearance in wireless capsule endoscopy (WCE) images, which can make their automatic detection difficult. We aimed to develop and test a deep learning–based system to automatically detect protruding lesions of various types in WCE images.

Methods

We trained a deep convolutional neural network (CNN), using 30,584 WCE images of protruding lesions from 292 patients. We evaluated CNN performance by calculating the area under the receiver operating characteristic curve (AUC), sensitivity, and specificity, using an independent set of 17,507 test images from 93 patients, including 7507 images of protruding lesions from 73 patients.
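As an illustration of the evaluation described above (not the authors' implementation), the sketch below shows how per-image probability scores from a trained CNN could be compared against binary lesion labels to obtain the AUC; the array names and placeholder data are assumptions standing in for the real test set.

```python
# Minimal sketch, not the authors' code: computing ROC AUC from per-image
# CNN probability scores. `y_true` (1 = protruding lesion, 0 = normal) and
# `y_score` are hypothetical placeholders for the 17,507-image test set.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=17_507)                               # placeholder labels
y_score = np.clip(0.4 * y_true + 0.6 * rng.random(17_507), 0.0, 1.0)   # placeholder scores

auc = roc_auc_score(y_true, y_score)
print(f"AUC = {auc:.3f}")
```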

Results

The developed CNN analyzed the 17,507 images in 530.462 seconds. The AUC for detection of protruding lesions was 0.911 (95% confidence interval [CI], 0.9069–0.9155). The sensitivity and specificity of the CNN were 90.7% (95% CI, 90.0%–91.4%) and 79.8% (95% CI, 79.0%–80.6%), respectively, at the optimal cutoff value of 0.317 for the probability score. In a subgroup analysis by category of protruding lesion, the sensitivities were 86.5%, 92.0%, 95.8%, 77.0%, and 94.4% for the detection of polyps, nodules, epithelial tumors, submucosal tumors, and venous structures, respectively. In the individual-patient analysis (n = 73), the detection rate of protruding lesions was 98.6%.
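The abstract does not state how the 0.317 cutoff was chosen; maximizing Youden's J index (sensitivity + specificity − 1) on the ROC curve is one common criterion, sketched below with hypothetical inputs. The per-patient detection rate can then be computed by counting a lesion-positive patient as detected if at least one of that patient's lesion images scores at or above the cutoff. The function and variable names (`y_true`, `y_score`, `patient_ids`) are assumptions for illustration.

```python
# Illustrative sketch under the assumption that the cutoff maximizes Youden's J;
# the abstract does not specify the criterion actually used.
import numpy as np
from sklearn.metrics import roc_curve

def optimal_cutoff(y_true, y_score):
    """Return (threshold, sensitivity, specificity) at the maximum of Youden's J."""
    fpr, tpr, thresholds = roc_curve(y_true, y_score)
    best = np.argmax(tpr - fpr)            # Youden's J = sensitivity + specificity - 1
    return thresholds[best], tpr[best], 1.0 - fpr[best]

def patient_detection_rate(patient_ids, y_true, y_score, cutoff):
    """Fraction of lesion-positive patients with at least one lesion image flagged at the cutoff."""
    detected, total = 0, 0
    for pid in np.unique(patient_ids):
        mask = patient_ids == pid
        if y_true[mask].any():             # patient has at least one lesion image
            total += 1
            detected += int((y_score[mask][y_true[mask] == 1] >= cutoff).any())
    return detected / total if total else float("nan")
```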

Conclusion

We developed and tested a new computer-aided system based on a CNN that automatically detects various protruding lesions in WCE images. Further studies with larger cohorts are needed for patient-level analyses and to improve diagnostic performance.



Updated: 2020-02-19