Scene-dependent, feedforward eye gaze metrics can differentiate technical skill levels of trainees in laparoscopic surgery

Dynamic Manuscript, published in Surgical Endoscopy.

Abstract

Introduction

In laparoscopic surgery, looking at target areas is an indicator of proficiency. However, gaze behaviors that reveal feedforward control (i.e., looking ahead) and their importance have been under-investigated in surgery. This study aims to establish the sensitivity and relative importance of different scene-dependent gaze and motion metrics for estimating trainee proficiency levels in surgical skills.

Methods

Medical students performed the Fundamentals of Laparoscopic Surgery peg transfer task while their gaze on the monitor and tool activities inside the trainer box were recorded. Using computer vision and fixation algorithms, five scene-dependent gaze metrics and one tool speed metric were computed for 499 practice trials. Cluster analysis on the six metrics grouped the trials into clusters corresponding to proficiency levels, and ANOVAs tested for differences between these levels. A Random Forest model was trained to assess the importance of each metric for predicting proficiency level.
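The article does not publish its analysis code. As a minimal sketch of the pipeline described above, assuming the six metrics live in a pandas DataFrame with hypothetical column names, k-means clustering (one common cluster-analysis choice; the exact algorithm and settings used by the authors are not given in this abstract) followed by per-metric one-way ANOVAs might look like this:

    # Sketch only: column names, file name, and the choice of k-means are
    # assumptions, not the authors' published code.
    import pandas as pd
    from scipy import stats
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    METRICS = ["gaze_m1", "gaze_m2", "gaze_m3", "gaze_m4",
               "gaze_m5", "tool_speed"]  # hypothetical metric names

    trials = pd.read_csv("trials.csv")  # hypothetical file: one row per trial

    # Standardize the six metrics so no single scale dominates the clustering.
    X = StandardScaler().fit_transform(trials[METRICS])

    # Group trials into three clusters, interpreted as proficiency levels.
    kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
    trials["level"] = kmeans.fit_predict(X)

    # One-way ANOVA per metric across the three levels.
    for m in METRICS:
        groups = [g[m].to_numpy() for _, g in trials.groupby("level")]
        f_stat, p_val = stats.f_oneway(*groups)
        print(f"{m}: F = {f_stat:.2f}, p = {p_val:.4g}")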

Results

Three clusters were identified, corresponding to three proficiency levels. The correspondence between clusters and proficiency levels was confirmed by differences in completion times (F(2, 488) = 38.94, p < .001). Further, ANOVAs revealed significant differences between the three levels for all six metrics. The Random Forest model predicted proficiency level with 99% out-of-bag accuracy and revealed that scene-dependent gaze metrics reflecting feedforward behaviors were more important for prediction than those reflecting feedback behaviors.
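For the Random Forest step, a comparable sketch using scikit-learn (an assumption; the authors' implementation is not specified in this abstract) that reports out-of-bag accuracy and impurity-based metric importances, continuing from the trials DataFrame above:

    from sklearn.ensemble import RandomForestClassifier

    # oob_score=True estimates accuracy on the bootstrap samples each tree
    # never saw, i.e., an out-of-bag accuracy like the one quoted above.
    rf = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
    rf.fit(trials[METRICS], trials["level"])

    print(f"Out-of-bag accuracy: {rf.oob_score_:.3f}")

    # Impurity-based importances: one value per metric, summing to 1;
    # higher values mark metrics that contribute more to the prediction.
    ranked = sorted(zip(METRICS, rf.feature_importances_), key=lambda t: -t[1])
    for name, importance in ranked:
        print(f"{name}: {importance:.3f}")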

Conclusion

Scene-dependent gaze metrics differentiated trainee skill levels at a finer granularity than the expert-versus-novice distinction reported in the literature. Further, feedforward gaze metrics appeared to be more important than feedback metrics for predicting proficiency.


Acknowledgements

The authors thank the Carilion Clinic personnel who volunteered to support our data collection.

Funding

This research was supported in part by the National Center for Advancing Translational Sciences of the National Institutes of Health under Award Number UL1TR003015.

Author information

Corresponding author

Correspondence to Nathan Lau.

Ethics declarations

Disclosures

Chaitanya S. Kulkarni, Shiyu Deng, Tianzi Wang, Jacob Hartman-Kenzler, Laura E. Barnes, Sarah Henrickson Parker, Shawn D. Safford, and Nathan Lau have no conflicts of interest or financial ties to disclose.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Below is the link to the electronic supplementary material.

Supplementary file 1 (MP4 53168 KB)

Rights and permissions

Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Kulkarni, C.S., Deng, S., Wang, T. et al. Scene-dependent, feedforward eye gaze metrics can differentiate technical skill levels of trainees in laparoscopic surgery. Surg Endosc 37, 1569–1580 (2023). https://doi.org/10.1007/s00464-022-09582-3

