Epistemic risks in cancer screening: Implications for ethics and policy

https://doi.org/10.1016/j.shpsc.2019.101200

Highlights

  • In 2018, the USPSTF recommended that men make individualized, autonomy-based decisions about prostate cancer screening.

  • Risk assessment for prostate cancer is pervaded by epistemic risks that reflect value judgments.

  • The pervasiveness of these epistemic risks creates under-explored difficulties for physician-patient communication.

  • Autonomous patient decision making will be difficult to achieve without significant organizational changes.

Abstract

Cancer screening is the subject of much debate; while screening has the potential to save lives by identifying and treating cancers in early stages, it is also the case that not all cancers cause symptoms, and the diagnosis of these cancers can lead to unnecessary treatments and subsequent side-effects and complications. This paper explores the relationships between epistemic risks in cancer diagnosis and screening, the social organization of medical research and practice, and policy making; it does this by examining 2018 recommendations by the United States Preventive Services Task Force that patients make individualized, autonomy-based decisions about cancer screening on the basis of discussions with their physicians. While the paper focuses on prostate cancer screening, the issues that it raises are relevant to other cancer screening programs, especially breast cancer. The paper argues that prostate cancer screening—and, more generally, the process of risk assessment for prostate cancer—is pervaded by epistemic risks that reflect value judgments and that the pervasiveness of these epistemic risks creates significant and under-explored difficulties for physician-patient communication and the achievement of autonomous patient decision making.

Introduction

Cancer screening—or testing for cancer in the absence of symptoms—is the subject of much debate; while screening has the potential to save lives by identifying and treating cancers in early stages, it is also the case that not all cancers cause symptoms, and the diagnosis of these cancers can lead to unnecessary treatments and subsequent side-effects (Klotz, 2013; Loeb et al., 2014; USPSTF, 2018; Welch, Schwartz, & Woloshin, 2011). The debate over cancer screening is part of a larger discussion about overdiagnosis and overtreatment of disease, and at the very least, the debate has highlighted the difficulties involved in balancing the risks of failing to treat against the risks of overtreatment (Hoffman & Cooper, 2012; Moynihan, Doust, & Henry, 2012; Welch et al., 2011).1 This paper will focus on the debate over prostate cancer screening, though many of the issues that it raises are relevant to other cancer screening programs, especially breast cancer (cf. Kourany & Fernández Pinto, 2018; Plutynski, 2017).

In 2018, the United States Preventive Services Task Force (USPSTF), an independent and influential panel of experts, published recommendations that men between the ages of 55 and 69 make individualized decisions about cancer screening on the basis of discussions with their physicians (USPSTF, 2018).2 The 2018 recommendations update earlier ones, issued in 2012, which recommended that no man of any age undergo screening. (The 2012 recommendations are consistent with the current recommendations of the National Health Service (NHS) in the UK (NHS, 2018).) The USPSTF's shift in recommendations, which opens the door to more men being screened, is based in part on a shift in values. The 2012 guidelines are paternalistic and reflect the norm of beneficence; the task force determined that it was not in the best interests of men to undergo screening, and as such, it recommended against screening. The 2018 recommendations, by contrast, reflect much more strongly the norm of respect for patient autonomy; they attempt to ensure that men “have an opportunity to discuss the potential benefits and harms of screening and to incorporate their values and preferences in the decision” (USPSTF, 2018, 1902).

Not surprisingly, the shift in USPSTF recommendations is controversial. A small but significant percentage of men die from prostate cancer (approximately 7% worldwide), but many others are overdiagnosed and suffer needlessly from complications of treatment—including erectile dysfunction, urinary incontinence, and bowel symptoms (Ferlay, 2013; USPSTF, 2018; Welch et al., 2011). The aim of the updated recommendations is to promote autonomous patient decision making by encouraging patients to discuss their cases with their physicians and to arrive at decisions that are best for them, given their goals and values. Given this aim, it is noteworthy that the recommendations do not address the question of whether autonomous patient decision making in these situations is realistically achievable.

In this paper, I will argue that prostate cancer screening—and, more generally, the process of risk assessment for prostate cancer—is pervaded by epistemic risks that reflect value judgments and that the pervasiveness of these epistemic risks creates significant and under-explored difficulties for physician-patient communication and the achievement of autonomous patient decision making. I do not address the question of whether patients, under some conditions, are capable of making autonomous and rational decisions; for the purpose of this paper, I assume that they are. Moreover, I do not assume that an autonomy-centered approach is the best of the available alternatives.3 Instead, I argue that the information provided to patients on the basis of risk assessments is likely to be value-laden in ways that are difficult to communicate transparently; because of this, even fully rational patients might not have access to the information they need to make informed decisions. In this regard, the paper raises significant challenges that autonomy-based approaches must overcome, if they are to succeed. If this is indeed the case, then it is an important result, particularly given that there are significant financial incentives for health care providers to nudge patients in the direction of treatment, even when treatment is unnecessary (Biddle, 2016; Moynihan et al., 2012; Welch et al., 2011). Given these incentives, it is important to ensure that patients are not manipulated, under a guise of respect for autonomy, to undergo treatment that is against their interests.

In developing the argument of the paper, I will explore the relationships between epistemic risks in cancer diagnosis, the social organization of medical research and practice, and policy possibilities. More specifically, I will argue that, given the epistemic risks that pervade the risk assessment process in prostate cancer screening, autonomous patient decision making will be difficult to achieve without significant changes in the organization of accompanying clinical research and practice. The paper will thus explore connections between the role of values in science and medicine, social epistemology, and biomedical ethics and policymaking.

The structure of the paper is as follows. Section 2 will introduce the concept of epistemic risk and its implications for accounts of the appropriate role of value judgments in science and risk assessment in medicine. Section 3 will discuss briefly the problems of overdiagnosis and overtreatment of prostate cancer, and Section 4 will examine the processes of prostate cancer diagnosis and screening and argue that, at each stage of the diagnostic process, there are epistemic risks that reflect value judgments. A primary focus of this section will be on the potential role of values in the assignment of Gleason scores to biopsied samples. Section 5 will address the implications of these findings for the organization of clinical research and practice, policy recommendations, and the ethics of prostate cancer screening, and it will raise and respond to two potential objections to the argument of the paper.

Section snippets

Epistemic risk and values in science and medicine

Scientific and medical research are pervaded by epistemic risk, which is defined broadly as the risk of error that arises at any point in knowledge-productive practices (Biddle, 2016, 2018; Biddle & Kukla, 2017; Kukla, 2017). Data points can be mischaracterized or misinterpreted; biased model-organism choices or biased data sets can lead to misleading findings; false conclusions can be drawn from statistical evidence; and so on. Many instances of epistemic risk reflect value judgments. In

Overdiagnosis and overtreatment of prostate cancer

Prostate cancer is the second most commonly diagnosed cancer in men worldwide. In 2012 (the most recent year for which global data are currently available), there were approximately 1.1 million cases of prostate cancer and 307,000 resulting deaths (Ferlay, 2013). In the United States in 2018, there were approximately 164,690 new cases of prostate cancer and 29,430 deaths (ACS, 2018). While prostate cancer is a significant public health burden, it is also the case that many men are diagnosed and

Epistemic risks in prostate cancer diagnosis

In this section, I will argue that epistemic risks that reflect value judgments are present in at least three stages of prostate cancer diagnosis: (1) the choice of a prostate-specific antigen (PSA) threshold for taking a biopsy, (2) the decision (in the case of a biopsy) of how many samples to take, and (3) the assignment of a Gleason score to biopsied samples. Before proceeding, however, it is important to clarify the relations between decisions about screening and decisions about treatment, and
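To make concrete the kind of trade-off involved in stage (1), the following is a minimal sketch (written here in Python, using entirely hypothetical PSA values, outcomes, and candidate cutoffs chosen for illustration only; none of these numbers are drawn from this paper or its sources). Lowering the PSA threshold for recommending a biopsy reduces the number of missed cancers but increases the number of unnecessary biopsies, and vice versa; how that balance should be struck is precisely the sort of epistemic risk that, on the argument of this paper, reflects value judgments.

    # Illustrative sketch only: hypothetical PSA values (ng/mL) paired with a
    # hypothetical finding of clinically significant cancer on biopsy.
    hypothetical_cases = [
        (1.2, False), (2.8, False), (3.5, True), (4.1, False),
        (4.6, True), (5.9, False), (6.3, True), (8.7, True),
        (9.4, False), (12.1, True),
    ]

    def error_profile(threshold):
        """Count missed cancers and unnecessary biopsies at a given PSA cutoff."""
        false_negatives = sum(1 for psa, cancer in hypothetical_cases
                              if cancer and psa < threshold)
        false_positives = sum(1 for psa, cancer in hypothetical_cases
                              if not cancer and psa >= threshold)
        return false_negatives, false_positives

    # Candidate cutoffs, chosen purely for illustration.
    for cutoff in (2.5, 4.0, 10.0):
        fn, fp = error_profile(cutoff)
        print(f"PSA cutoff {cutoff} ng/mL: {fn} missed cancers, {fp} unnecessary biopsies")

The point of the sketch is not the numbers themselves but the structure of the decision: any choice of cutoff distributes errors of different kinds, and hence harms, differently.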

Obstacles to autonomous patient decision making

A traditional way of thinking about the relations between risk assessment, risk communication, and patient autonomy is based on the ideal that risk assessment can and should be value neutral. According to this way of thinking, patients who are considering undergoing screening should have discussions with their physicians; if patients decide to undergo screening and have their PSA levels tested, then physicians provide the patients with neutral information about their risks (including PSA levels

Conclusion

This paper has explored the relationships between epistemic risks in prostate cancer diagnosis and the set of reasonable policy possibilities, as well as the ways in which these relationships are mediated by the social organization of the risk assessment and communication process. In a 2018 report, the USPSTF recommended that patients considering prostate cancer screening discuss the matter with their physicians and then make autonomous decisions that incorporate their own values and interests (

Acknowledgments

Versions of this paper have been presented at the Philosophy of Cancer workshop at Cambridge University, the Philosophy of Science Association (PSA) Meeting in Seattle, the Society for Philosophy of Science in Practice (SPSP) in Ghent, the meeting of the Consortium for Socially Relevant Philosophy of/in Science and Engineering (SRPoiSE) in Atlanta, the Workshop on Original Policy Research at Georgia Tech, and the Georgia Tech Philosophy Club. Thanks to all of the participants of those meetings

References (41)

  • Blumenthal-Barby et al. (2015). Toward ethically responsible choice architecture in prostate cancer treatment decision-making. CA: A Cancer Journal for Clinicians.
  • Brown, M. (forthcoming). Is science really value free and objective? From objectivity to scientific integrity. In McCain...
  • Churchman, C. W. (1948). Theory of experimental inference.
  • Douglas, H. (2000). Inductive risk and values in science. Philosophy of Science.
  • Douglas, H. (2009). Science, policy, and the value-free ideal.
  • Douglas, H. Why inductive risk requires values in science.
  • Elliott, K. (2011). Is a little pollution good for you? Incorporating societal values in environmental research.
  • Elliott, K. (2017). A tapestry of values: An introduction to values in science.
  • Ferlay, J., et al. (2013). GLOBOCAN 2012 v1.0, cancer incidence and mortality worldwide: IARC CancerBase No. 11 [internet].