Acoustic signal analysis of instrument-tissue interaction for minimally invasive interventions.
International Journal of Computer Assisted Radiology and Surgery (IF 2.3), Pub Date: 2020-04-22, DOI: 10.1007/s11548-020-02146-7
Daniel Ostler 1, 2 , Matthias Seibold 1, 2, 3 , Jonas Fuchtmann 1 , Nicole Samm 1, 4 , Hubertus Feussner 1, 4 , Dirk Wilhelm 1, 4 , Nassir Navab 2
PURPOSE: Minimally invasive surgery (MIS) has become the standard for many surgical procedures, as it minimizes trauma, reduces infection rates, and shortens hospitalization. However, manipulating objects in the surgical workspace can be difficult due to the unintuitive handling of instruments and the limited range of motion. Despite the advantages of robot-assisted systems, such as an augmented view and improved dexterity, both robotic and conventional MIS techniques introduce drawbacks such as limited haptic perception and a heavy reliance on visual perception.

METHODS: To address these limitations, a perception study was conducted to investigate whether transmitting intra-abdominal acoustic signals can improve perception during MIS. To investigate whether these acoustic signals can serve as a basis for further automated analysis, a large audio data set was acquired, capturing the application of electrosurgery to different types of porcine tissue. A sliding-window technique was applied to compute log-mel spectrograms, which were fed to a pre-trained convolutional neural network for feature extraction. A fully connected layer was then trained on the intermediate feature representation to classify instrument-tissue interaction.

RESULTS: The perception study revealed that acoustic feedback has the potential to improve perception during MIS and to serve as a basis for further automated analysis. The proposed classification pipeline yielded excellent performance for four types of instrument-tissue interaction (muscle, fascia, liver, and fatty tissue), achieving top-1 accuracies of up to 89.9%. Moreover, the model distinguishes electrosurgical operation modes with an overall classification accuracy of 86.40%.

CONCLUSION: This proof of principle indicates great application potential for guidance systems in MIS, such as controlled tissue resection.
Supported by a pilot perception study with surgeons, we believe that utilizing audio signals as an additional information channel has great potential to improve surgical performance and to partly compensate for the loss of haptic feedback.
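The log-mel front end described in METHODS can be sketched in plain numpy. The paper does not specify its exact parameters, so the sampling rate, FFT size, hop length, and number of mel bands below are illustrative assumptions, and the synthetic noise input merely stands in for a recorded electrosurgery clip:

```python
import numpy as np

def hz_to_mel(f):
    return 2595.0 * np.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mel_filterbank(n_mels, n_fft, sr):
    # Triangular filters spaced evenly on the mel scale.
    mel_pts = np.linspace(hz_to_mel(0.0), hz_to_mel(sr / 2.0), n_mels + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mel_pts) / sr).astype(int)
    fb = np.zeros((n_mels, n_fft // 2 + 1))
    for i in range(1, n_mels + 1):
        left, center, right = bins[i - 1], bins[i], bins[i + 1]
        for k in range(left, center):
            fb[i - 1, k] = (k - left) / max(center - left, 1)
        for k in range(center, right):
            fb[i - 1, k] = (right - k) / max(right - center, 1)
    return fb

def log_mel_spectrogram(x, sr=16000, n_fft=512, hop=256, n_mels=64):
    # Sliding window: Hann-windowed frames, power spectrum via real FFT.
    window = np.hanning(n_fft)
    n_frames = 1 + (len(x) - n_fft) // hop
    frames = np.stack([x[i * hop:i * hop + n_fft] * window
                       for i in range(n_frames)])
    power = np.abs(np.fft.rfft(frames, n=n_fft)) ** 2
    # Project onto mel bands, then take the log (small offset avoids log(0)).
    mel = power @ mel_filterbank(n_mels, n_fft, sr).T
    return np.log(mel + 1e-10)

# Illustrative input: 1 s of synthetic noise standing in for an audio clip.
sr = 16000
x = np.random.default_rng(0).standard_normal(sr)
S = log_mel_spectrogram(x, sr=sr)
print(S.shape)  # (61, 64): 61 sliding-window frames x 64 mel bands
```

Each resulting spectrogram frame matrix would then be treated as an image and passed through the pre-trained CNN; only the final fully connected layer is trained on the extracted features, which keeps the amount of labeled audio needed for the four-class tissue task small.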

Updated: 2020-04-23