Viewpoint
Oct 2019

What Clinical Ethics Can Learn From Decision Science

Michele C. Gornick, PhD, MA and Brian J. Zikmund-Fisher, PhD, MA
AMA J Ethics. 2019;21(10):E906-912. doi: 10.1001/amajethics.2019.906.

Abstract

Many components of decision science are relevant to clinical ethics practice. Decision science encourages thoughtful definition of options, clarification of information needs, and acknowledgement of the heterogeneity of people’s experiences and underlying values. Attention to decision-making processes reminds participants in consultations that how decisions are made and how information is provided can change a choice. Decision science also helps reveal affective forecasting errors (errors in predictions about how one will feel in a future situation) that happen when people consider possible future health states and suggests strategies for correcting these and other kinds of biases. Implementation of decision science innovations is not always feasible or appropriate in ethics consultations, but their use increases the likelihood that an ethics consultation process will generate choices congruent with patients’ and families’ values.

Elements of Decision Science in Clinical Ethics

When we first raised the idea of connecting decision science to the practice of clinical ethics, we got some strange looks. After all, the phrase decision science might evoke images of mathematical decision trees and computational modeling, whereas the prototypical picture of a clinical ethics consultation is one of health professionals, ethics consultants, patients, and family members gathering to interpret ethical dimensions of health care experiences. From this perspective, there wouldn’t seem to be much overlap.

Yet while few would argue that a mathematical decision tree is critical in ethics consultation, multiple concepts that fall under the broader umbrella of decision science are indeed relevant to clinical ethics practice. Normative decision analysis, which encompasses analytical modeling of decisions and calculation of expected value or decision utility,1,2 provides important reminders that any decision about uncertain risks or benefits requires assessing as precisely as possible the likelihood and severity of all relevant possible outcomes. Informed decision-making standards identify the critical information that stakeholders must know before making their decisions.3 For example, a “reasonable” person standard requires that decision makers know all that a reasonable person would want to know prior to choosing.4 Decision psychology provides insights into the predictable biases that influence people’s perceptions of the health risks they face5,6 and the ways that decision making about risk is simultaneously analytical and emotion driven.7
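
To make the idea of expected value concrete, the sketch below shows the kind of calculation normative decision analysis performs over a simple decision tree. It is our illustration rather than anything the article prescribes, and the options, probabilities, and utilities are hypothetical placeholders, not clinical estimates.

```python
# Minimal sketch of an expected-utility calculation for a two-option decision.
# All options, probabilities, and utilities are hypothetical placeholders.

# Each option maps mutually exclusive outcomes to (probability, utility on a 0-1 scale).
options = {
    "intervention": {
        "full recovery":         (0.70, 0.95),
        "recovery with deficit": (0.25, 0.60),
        "perioperative death":   (0.05, 0.00),
    },
    "watchful waiting": {
        "stable condition":      (0.50, 0.80),
        "gradual decline":       (0.45, 0.40),
        "acute crisis":          (0.05, 0.10),
    },
}

def expected_utility(outcomes):
    """Sum of probability-weighted utilities across mutually exclusive outcomes."""
    return sum(p * u for p, u in outcomes.values())

for name, outcomes in options.items():
    total_p = sum(p for p, _ in outcomes.values())
    assert abs(total_p - 1.0) < 1e-9, f"probabilities for {name} must sum to 1"
    print(f"{name}: expected utility = {expected_utility(outcomes):.3f}")
```

Few consultations would ever run such a calculation explicitly, but the discipline it imposes, enumerating every outcome and being explicit about likelihood and value, is exactly what the sections that follow translate into conversational practice.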

In particular, clarifying decision-making processes can enable all-important shifts from considering only what needs to be discussed in an ethics consultation to considering how a decision process unfolds and should unfold.8,9 For example, it is important to ask questions such as (1) How do clinicians or patients actually go about the process of making their difficult decisions, both individually and collectively? (2) More specifically, how is the decision process incomplete or biased (eg, due to failures to search for relevant information, recognize relevant options, or incorporate individual perspectives)? (3) How can systematic consideration of individual and collective decision-making processes help to improve outcomes and decrease future regret?

Below, we discuss how key features of high-quality decision-making processes can be applied in clinical ethics.

Good Decision-Making Processes

Decision science suggests that ethics consultations can aspire to support the following characteristics10 of good decision-making processes:

  1. Identify a complete option set. Good decision processes require understanding of the full option set, including inaction when appropriate.10 When parties disagree about the options among which they are choosing, consensus rarely results. Ethics consultants can engage health professionals early in case reviews to ensure that all options (not just those preferred by one stakeholder, for example) are raised for consideration.
  2. Learn about relevant possible outcomes. Good decision processes require information about possible outcomes, in terms of both their likelihood and their character and severity.10 Since most outcomes have multiple components, a good information-gathering process involves clarifying different dimensions of a choice and ways in which outcomes’ severity or likelihood could differ. For example, ethics consultations can help to ensure that all stakeholders learn about and consider issues such as possible changes in quality of life over time, the presence of rare but significant complications, barriers to treatment adherence, or practical implications of different options for the patient or family.
  3. Consider personalized impact of possible outcomes. Good decision processes require recognition that outcomes can be perceived differently by different stakeholders.10 Aside from mortality and morbidity risks of a particular intervention for a particular patient, how good or bad an outcome is for that specific person at that specific time should be considered from that patient’s perspective, not the clinician’s perspective. Ethics consultations can help cultivate opportunities for patients and family members to consider and voice what different outcomes could mean from their perspectives. Asking “What would the implications be for you if that were to happen?” can promote self-reflection and help patients and family members to concretely envision the impact of different possible outcomes.
  4. Integrate decision makers’ core values. Good decision processes require assigning value and importance to different possible outcomes, trade-offs, or other aspects of a decision.10 Doing so requires decision makers to state, for example, “I care a lot about X” or “Whether Y happens doesn’t matter much to me.” Survival, for instance, is not always valued over other attributes. This stage in a decision-making process is often referred to as values clarification.11 Values clarification references relatively stable values people hold as a result of personal, familial, or cultural experiences with health care and examines how those values inform a specific decision. Ethics consultations can facilitate stakeholders’ reflections about how their values should inform their decisions, especially when the available options reflect trade-offs between short- and long-term outcomes.

Value Congruence

Implementing these 4 steps during ethics consultations tends to produce choices that are values-congruent.11 In other words, what gets chosen tends to align with what decision makers care about. Someone who values maximizing quality of life over quantity of life might choose to pursue hospice care earlier after a terminal diagnosis than someone with different values; this is an example of a values-congruent care plan. Someone who values minimizing pain but chooses to undergo a painful intervention, particularly if less painful options are available, is not receiving values-congruent care. A values-incongruent choice could be made for a number of reasons (eg, misunderstanding an option set, misunderstanding options’ implications) and should probably be regarded as a clinically and ethically problematic outcome of a health care decision process, particularly one that was aided by an ethics consultation.
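
As a rough illustration of what value congruence can mean in decision-science terms, the hypothetical sketch below scores two options against a decision maker’s stated priorities using a simple weighted sum. The attributes, weights, and scores are invented for illustration; the article does not prescribe any particular formula.

```python
# Illustrative sketch only: one simple way to operationalize "values congruence"
# as weighted multi-attribute scoring. All attributes, weights, and scores are
# hypothetical.

# How much this decision maker cares about each attribute (weights sum to 1).
values = {"quality_of_life": 0.6, "length_of_life": 0.2, "avoiding_pain": 0.2}

# How well each option serves each attribute, on a 0-1 scale.
option_scores = {
    "early hospice":        {"quality_of_life": 0.9, "length_of_life": 0.3, "avoiding_pain": 0.9},
    "aggressive treatment": {"quality_of_life": 0.4, "length_of_life": 0.7, "avoiding_pain": 0.3},
}

def weighted_score(scores, weights):
    """Value-weighted sum over attributes."""
    return sum(weights[a] * scores[a] for a in weights)

best = max(option_scores, key=lambda o: weighted_score(option_scores[o], values))

chosen = "aggressive treatment"  # hypothetical actual choice
congruent = chosen == best
print(f"Best fit with stated values: {best}; chosen: {chosen}; congruent: {congruent}")
```

In this contrived example, the chosen option does not match the option that best fits the stated values, mirroring the values-incongruent choice described above.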

There are 3 important facts that stakeholders in ethics consultations need to understand about value congruence in health decision making. First, while people’s values tend to be relatively stable (ie, we generally care about the same things in most situations), their preferences are sensitive to context and constructed at the moment of a decision.12,13 Hence, people’s preferences can be influenced by how a decision is framed or how it is made.14 Framing outcomes in terms of chances of survival, for example, can lead to different choices than when the same information is framed in terms of chances of death.5,15,16 Second, preferences can be role dependent: even given the same information, people express different preferences when making decisions for themselves than for others.17,18,19 Finally, societies at large can have preferences that differ from those of individuals.20 Appreciation of these 3 facts can help illuminate how stakeholders’ values and patterns of assigning value to different possible health outcomes play out during ethics consultations.

Barriers to Values-Congruent Decision Making

Pursuing value congruence in ethics consultation and health care decision-making processes can help to maximize value for all stakeholders and minimize decisional regret. Yet there are many reasons why decision making about ethically complex cases might not result in values-congruent outcomes.

First, there can be barriers to gathering all relevant information. Certain options might be excluded from consideration due to external constraints such as insurance rules or patients’ inability to travel. Critical information might be unavailable, or there might be insufficient time to absorb and consider the relevant information. In particular, there is often substantial uncertainty regarding either the likelihood of outcomes or their severity. In truly unusual situations, medical professionals might not know what kinds of outcomes are even possible.

Second, a more general but pernicious barrier to value congruence arises from affective forecasting errors.21,22 Health decisions often require people to make choices about states of being with which they have no experience. Depending on context, people might think some outcomes or experiences are much worse or much better than they actually are. Even when people accurately anticipate what a health state or treatment experience will be like for them and how mild or severe it might be, they might not be able to appreciate its impact on their lives or feelings. A key part of decision support, therefore, involves identifying when forecasting errors might occur and how they can be corrected. For example, one approach to addressing affective forecasting errors involves patients who have “been there” sharing their experiences to help address the misperceptions of patients trying to imagine what it would be like for them.23,24

Third, there are also limits to the deference that can or should be accorded some values of some individuals,12,25,26 particularly when those values conflict with other important ethical values. When relevant stakeholders’ values are in conflict, a good decision-making process will clarify when differing values, rather than misunderstandings of information, lie at the heart of the disagreement. At such moments, clarity regarding which stakeholder holds decisional authority is essential.

Not every ethics consultation or medical decision, however, needs to involve a detailed deliberation that elicits every stakeholder’s values in a shared decision-making process. While that vision is a worthy aspiration in many contexts, it is impractical or inappropriate in others. Being aware of potential barriers to effective decision making can help to suggest situation-appropriate approaches when values-congruent decision making is not possible.

Decision Science in Ethics Practice

Clinical ethicists can support informed, value-congruent decision making in ethically complex clinical situations by working with stakeholders to identify and address biases and the kinds of barriers just discussed. Doing so requires constantly comparing actual decision-making processes with ideal decision-making processes, responding to information deficits, and integrating stakeholder values. One key step involves regularly urging clinicians to clarify both available options and possible outcomes and encouraging patients to consider both their values and the possible meanings of different outcomes. Decision science suggests the importance of thoughtful definition of an option set, clarification of the relevance of information, acknowledgement of the heterogeneity of stakeholders’ experiences and values, and acceptance of the plurality of stakeholder perspectives about health experiences and the desirability of health outcomes. In turn, health care deliberations remind decision science that application of these principles will always be complex when decisions pose real and important consequences for stakeholders.

References

  1. Chapman GB, Sonnenberg FA. Decision Making in Health Care: Theory, Psychology, and Applications. Cambridge, UK: Cambridge University Press; 2000.
  2. Diefenbach MA, Miller-Halegoua S, Bowen D, eds. Handbook of Health Decision Science. New York, NY: Springer; 2016.
  3. Braddock CH, Edwards KA, Hasenberg NM, Laidley TL, Levinson W. Informed decision making in outpatient practice: time to get back to basics. JAMA. 1999;282(24):2313-2320.
  4. Kravitz RL, Melnikow J. Engaging patients in medical decision making. BMJ. 2001;323(7313):584-585.
  5. Kahneman D. Thinking, Fast and Slow. New York, NY: Farrar, Straus and Giroux; 2011.
  6. Klein WMP, Stefanek ME. Cancer risk elicitation and communication: lessons from the psychology of risk perception. CA Cancer J Clin. 2007;57(3):147-167.
  7. Slovic P. The Perception of Risk. Sterling, VA: Earthscan Publications; 2000.
  8. Charles C, Gafni A, Whelan T. Shared decision-making in the medical encounter: what does it mean? (or it takes at least two to tango). Soc Sci Med. 1997;44(5):681-692.
  9. Elwyn G, Frosch D, Thomson R, et al. Shared decision making: a model for clinical practice. J Gen Intern Med. 2012;27(10):1361-1367.
  10. Yates JF. Decision Management: How to Assure Better Decisions in Your Company. San Francisco, CA: Jossey-Bass; 2003.
  11. Witteman HO, Scherer LD, Gavaruzzi T, et al. Design features of explicit values clarification methods: a systematic review. Med Decis Making. 2016;36(4):453-471.
  12. Warren C, McGraw AP, Van Boven L. Values and preferences: defining preference construction. Wiley Interdiscip Rev Cogn Sci. 2011;2(2):193-205.
  13. Dhar R, Novemsky N. Beyond rationality: the content of preferences. J Consum Psychol. 2008;18(3):175-178.
  14. Simonson I. Will I like a “medium” pillow? Another look at constructed and inherent preferences. J Consum Psychol. 2008;18(3):155-169.
  15. Malenka DJ, Baron JA, Johansen S, Wahrenberger JW, Ross JM. The framing effect of relative and absolute risk. J Gen Intern Med. 1993;8(10):543-548.
  16. Perneger TV, Agoritsas T. Doctors and patients’ susceptibility to framing bias: a randomized trial. J Gen Intern Med. 2011;26(12):1411-1417.
  17. Zikmund-Fisher BJ, Sarr B, Fagerlin A, Ubel PA. A matter of perspective: choosing for others differs from choosing for yourself in making treatment decisions. J Gen Intern Med. 2006;21(6):618-622.
  18. Ubel PA, Angott AM, Zikmund-Fisher BJ. Physicians recommend different treatments for patients than they would choose for themselves. Arch Intern Med. 2011;171(7):630-634.
  19. Garcia-Retamero R, Galesic M. Doc, what would you do if you were me? On self-other discrepancies in medical decision making. J Exp Psychol Appl. 2012;18(1):38-51.
  20. Batson CD. Prosocial motivation: is it ever truly altruistic? In: Berkowitz L, ed. Advances in Experimental Social Psychology. Vol 20. San Diego, CA: Academic Press; 1987:65-122.
  21. Wilson TD, Gilbert DT. Affective forecasting: knowing what to want. Curr Dir Psychol Sci. 2005;14(3):131-134.
  22. Halpern J, Arnold RM. Affective forecasting: an unrecognized challenge in making serious health decisions. J Gen Intern Med. 2008;23(10):1708-1712.
  23. Shaffer VA, Focella ES, Scherer LD, Zikmund-Fisher BJ. Debiasing affective forecasting errors with targeted, but not representative, experience narratives. Patient Educ Couns. 2016;99(10):1611-1619.
  24. Focella ES, Zikmund-Fisher BJ, Shaffer VA. Could physician use of realistic previews increase treatment adherence and patient satisfaction? Med Decis Making. 2016;36(6):683-685.
  25. Goldman JJ, Shih TL. The limitations of evidence-based medicine: applying population-based recommendations to individual patients. Virtual Mentor. 2011;13(1):26-30.
  26. Whitney SN. A new model of medical decisions: exploring the limits of shared decision making. Med Decis Making. 2003;23(4):275-280.


Conflict of Interest Disclosure

The author(s) had no conflicts of interest to disclose.

The viewpoints expressed in this article are those of the author(s) and do not necessarily reflect the views and policies of the AMA.