What does it mean to provide decision support to a responsible and competent expert?

The case of diagnostic decision support systems

  • Original Article
  • Published in: EURO Journal on Decision Processes

Abstract

Decision support consists in helping a decision-maker improve his or her decisions. However, clients requesting decision support are often themselves experts, and they are often taken by third parties and/or the general public to be responsible for the decisions they make. This predicament raises complex challenges for decision analysts, who must avoid infringing upon the expertise and responsibility of the decision-maker. The case of diagnostic decision support in healthcare contexts is particularly illustrative. To support clinicians in their work and minimize the risk of medical error, various decision support systems have been developed as part of information systems that are now ubiquitous in healthcare contexts. With a view to developing, in collaboration with the hospitals of Lyon, a diagnostic decision support system for day-to-day, customary consultations, we propose in this paper a critical analysis of current approaches to diagnostic decision support, which mainly consist in providing physicians with guidelines or even full-fledged diagnosis recommendations. We highlight that the use of such decision support systems by physicians raises responsibility issues, and that it is at odds with the needs and constraints of customary consultations. We argue that the historical choice to favor guidelines or recommendations to physicians implies a very specific vision of what it means to support physicians, and that the flaws of this vision partially explain why current diagnostic decision support systems are not accepted by physicians for customary situations. Based on this analysis, we propose that decision support to physicians for customary cases should be deployed in an "adjustive" approach, which consists in providing physicians with the patient data they need, when they need them, during consultations. The rationale articulated in this article extends beyond clinical decision support and bears lessons for decision support activities in other contexts where decision-makers are competent and responsible experts.

Notes

  1. www.mims.co.

  2. www.nice.org.uk/guidance/ng28.

  3. www.nice.org.uk.

  4. www.has-sante.fr.

  5. http://www.mghlcs.org/projects/dxplain.

  6. Some exceptions exist to the two predominant subsets of DDSSs (guideline-based and ML-based DDSSs). Gräßer et al. (2017) proposed a DDSS dedicated to providing therapy recommendations based not on expert guidelines or machine learning algorithms, but on similarity measures between the current case and previous ones, computed anew for each case, without any learning process involved. Although this system is akin to ML-based DDSSs, it does not use ML algorithms (a minimal illustrative sketch of such a similarity-based approach is given after these notes). Similarly, Giordanengo et al. (2019) proposed a DDSS dedicated to presenting physicians with self-collected patient data and reminders of actions to take during consultations of patients with diabetes. In this work, Giordanengo et al. (2019) did not use the guidelines of any health authority, but included physicians in the development process of the DDSS to establish rules to apply in specific situations. In addition, the recommendations established by consensus among the physicians involved are not intended for other physicians, but for the developers adding the needed features to the DDSS. Lastly, the ML-based DDSS proposed by Simon et al. (2019) does not use ML algorithms to make recommendations but to detect complex concepts in medical documents, thereby facilitating access to information on patients or to reference documents. With this DDSS, Simon et al. (2019) showed that ML algorithms can be used in ways other than producing recommendations while still providing support to physicians in practice.

  7. The emerging field of Explainable AI (Doran et al. 2017; Gunning 2017; Rudin and Radin 2019) holds promise of mitigating this problem.

  8. https://futureoflife.org/ai-principles/.
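
A minimal sketch of what a similarity-based retrieval of previous cases (note 6) could look like is given below, in Python. It illustrates the general principle only and is not the actual system of Gräßer et al. (2017): the feature names, the Gower-like similarity measure, and the idea of returning the therapies of the closest past cases as suggestions are all assumptions made for the purpose of the example.

```python
# Illustrative sketch only: a minimal similarity-based case retrieval,
# not the method of Gräßer et al. (2017). Features, ranges, and the
# Gower-like similarity below are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class Case:
    features: dict       # e.g. {"age": 54, "smoker": 1, "lesion_type": "plaque"}
    therapy: str = ""    # therapy recorded for past cases; empty for the new case

def similarity(a: Case, b: Case, numeric_ranges: dict) -> float:
    """Average per-feature similarity: numeric features are compared by
    normalised distance, categorical features by equality."""
    scores = []
    for name, va in a.features.items():
        vb = b.features.get(name)
        if vb is None:
            continue
        if isinstance(va, (int, float)) and name in numeric_ranges:
            rng = numeric_ranges[name] or 1.0
            scores.append(1.0 - min(abs(va - vb) / rng, 1.0))
        else:
            scores.append(1.0 if va == vb else 0.0)
    return sum(scores) / len(scores) if scores else 0.0

def suggest_therapies(new_case: Case, past_cases: list,
                      numeric_ranges: dict, k: int = 3):
    """Rank past cases by similarity to the new one and return the
    therapies of the k closest cases; no model is trained at any point."""
    ranked = sorted(past_cases,
                    key=lambda c: similarity(new_case, c, numeric_ranges),
                    reverse=True)
    return [(c.therapy, round(similarity(new_case, c, numeric_ranges), 2))
            for c in ranked[:k]]

if __name__ == "__main__":
    past = [
        Case({"age": 60, "smoker": 1, "lesion_type": "plaque"}, therapy="topical steroid"),
        Case({"age": 35, "smoker": 0, "lesion_type": "pustule"}, therapy="antibiotic"),
        Case({"age": 58, "smoker": 1, "lesion_type": "plaque"}, therapy="phototherapy"),
    ]
    new = Case({"age": 55, "smoker": 1, "lesion_type": "plaque"})
    print(suggest_therapies(new, past, numeric_ranges={"age": 50.0}))
```

The point of the sketch is that nothing is learned: the similarity is recomputed from scratch for each new case, which is what distinguishes this family of DDSSs from ML-based ones.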

References

Acknowledgements

This work was carried out in collaboration with employees of the hospitals of Lyon. We thank all of them. Special thanks to Pr. Moulin and Dr. Riou for their suggestions and instructive discussions. Special thanks also to J. Rouchier, O. Cailloux, and P. Grill for their advice and comments on earlier versions of this manuscript, and to P. Castets for his support in implementing this project. We also thank two anonymous reviewers of the journal for their incisive and exacting comments and criticisms.

Author information

Corresponding author

Correspondence to Yves Meinard.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Richard, A., Mayag, B., Talbot, F. et al. What does it mean to provide decision support to a responsible and competent expert?. EURO J Decis Process 8, 205–236 (2020). https://doi.org/10.1007/s40070-020-00116-7

Keywords

Mathematics Subject Classification
