A deep-learning-based approach for noise reduction in high-speed optical coherence Doppler tomography.
Journal of Biophotonics (IF 2.8), Pub Date: 2020-07-10, DOI: 10.1002/jbio.202000084
Ang Li, Congwu Du, Nora D Volkow, Yingtian Pan

Optical coherence Doppler tomography (ODT) is attracting increasing attention because of its unprecedented advantages in high contrast, capillary-level resolution and flow-speed quantification. However, the trade-off between the signal-to-noise ratio of ODT images and A-scan sampling density significantly slows imaging, constraining its clinical applications. To accelerate ODT imaging, a deep-learning-based approach is proposed to suppress the overwhelming phase noise arising from low sampling density. To address the scarcity of paired training datasets, a generative adversarial network is trained to implicitly learn the distribution underlying Doppler phase noise and to generate synthetic training data. A 3D convolutional neural network is then trained on these data and applied for image denoising. This approach is demonstrated to outperform traditional denoising methods in both noise reduction and preservation of image detail, enabling high-speed ODT imaging with low A-scan sampling density.
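As background to the phase-noise problem the abstract describes, the sketch below (an illustrative assumption, not the authors' code or data) simulates how ODT estimates a Doppler phase shift between adjacent A-scans and how phase noise — which grows when A-scan sampling density drops — corrupts that estimate. The lag-1 autocorrelation (Kasai-style) phase estimator and the Gaussian phase-noise model are standard simplifications chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated complex OCT signal: two adjacent A-scans over depth, where
# axial flow imposes a true Doppler phase shift between them.
true_phase = 0.3          # radians, hypothetical flow-induced shift
n_depth = 256             # depth pixels in one A-scan
a1 = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, n_depth))
a2 = a1 * np.exp(1j * true_phase)

# Phase noise (modeled here as zero-mean Gaussian) standing in for the
# decorrelation caused by low A-scan sampling density.
sigma = 0.5
a2_noisy = a2 * np.exp(1j * rng.normal(0.0, sigma, n_depth))

# Kasai-style Doppler estimate: angle of the lag-1 autocorrelation,
# averaged over depth to suppress (but not remove) the phase noise.
phase_est = np.angle(np.mean(np.conj(a1) * a2_noisy))
print(phase_est)
```

Averaging over depth pulls the estimate toward the true 0.3 rad, but per-pixel phase maps at this noise level remain heavily corrupted — which is the regime where the paper's learned 3D-CNN denoiser is applied instead of simple averaging.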
