Medical Education
Sep 2020

Believing in Overcoming Cognitive Biases

Tiffany S. Doherty, PhD and Aaron E. Carroll, MD, MS
AMA J Ethics. 2020;22(9):E773-778. doi: 10.1001/amajethics.2020.773.

Abstract

Like all humans, health professionals are subject to cognitive biases that can render diagnoses and treatment decisions vulnerable to error. Learning effective debiasing strategies and cultivating awareness of confirmation bias, anchoring bias, outcomes bias, and the affect heuristic, among others, and of their effects on clinical decision making should be prioritized at all stages of health professions education.

Introduction

Cognitive biases contribute significantly to diagnostic and treatment errors.1,2 A 2016 review of their roles in decision making lists 4 domains of concern for physicians: gathering evidence, interpreting evidence, taking action, and evaluating decisions.3 Although experts have identified many types of cognitive biases, specific examples from these domains include confirmation bias, anchoring bias, the affect heuristic, and outcomes bias. In this article, we first discuss these biases, how they affect medical decision making, and how cognitive psychology informs effective debiasing strategies. We then discuss specific debiasing strategies and how to integrate them into education.

Examples of Cognitive Biases

Confirmation bias is the selective gathering and interpretation of evidence consistent with current beliefs and the neglect of evidence that contradicts them.4 It can occur when a physician refuses to consider alternative diagnoses once an initial diagnosis has been established, despite contradictory data, such as laboratory results. This bias leads physicians to see what they want to see. Because it occurs early in the treatment pathway, confirmation bias can lead to mistaken diagnoses being passed on to and accepted by other clinicians without their validity being questioned, a process referred to as diagnostic momentum.5

Anchoring bias is closely related to confirmation bias and comes into play when interpreting evidence. It refers to physicians’ tendency to prioritize information and data that support their initial impressions, even when those impressions are wrong. It often manifests when the first piece of information given to a physician is relied upon too heavily in subsequent decisions.3 For example, a patient’s back pain might be attributed to known osteoporosis without other potential causes being ruled out.

When physicians move from deliberation to action, they are sometimes swayed by emotional reactions rather than rational deliberation about risks and benefits. This is called the affect heuristic, and, while heuristics can often serve as efficient approaches to problem solving, they can sometimes lead to bias.3 The affect heuristic is context or patient specific and can manifest when physicians label patients as “complainers” or when they experience positive or negative feelings toward a patient, based on prior experiences.6

Further down the treatment pathway, outcomes bias can come into play. This bias refers to the tendency to believe that good or bad results are always attributable to prior decisions, even when there is no valid reason to think so.3 Because feedback on clinical decisions is critical for identifying weaknesses and potential mistakes, this bias can prevent clinicians from taking appropriate feedback into account to improve future performance. Although the relationship between decisions and outcomes might seem intuitive, the outcome of a decision cannot be the sole determinant of its quality; a good outcome can follow a poor clinical decision, and vice versa.

Metacognition and Clinical Decision Making

We can help mitigate failures of clinical reasoning by teaching physicians and trainees to cultivate insight into their own thinking processes. Dual-process theory, a cognitive model of reasoning, is particularly relevant to clinical decision making.7,8 This theory holds that we use 2 different cognitive systems, intuitive and analytical, when reasoning. The former is quick and uses information that is readily available; the latter is slower and more deliberate.

We more commonly use intuitive thinking strategies because they are fast and reasonably effective. For example, intuitive thinking would likely lead to a flu diagnosis for a patient presenting with fever, fatigue, and joint pain during winter months. However, compared with analytical thinking strategies, intuitive strategies are much more prone to error. Jumping to a diagnosis of influenza because it is flu season, for instance, might cause one to neglect to investigate other diagnoses for that patient (eg, meningococcal meningitis). Intuitive strategies benefit from experience and are necessary in situations in which time and information are lacking (eg, in emergency departments). These strategies rely on heuristics, or mental shortcuts that are generally sufficient, but not guaranteed, to lead to the right answer. In contrast, analytical strategies require more time and resources but allow the use of deductive logic to reach a diagnostic or treatment decision that is less subject to external factors (eg, previous experience, test availability).9 Effective debiasing strategies mainly involve a deliberate switch between these 2 types of thinking.

Consideration should also be given to how difficult it is for physicians to employ analytical thinking exclusively. Beyond constraints of time, information, and resources, many physicians are also likely to be sleep deprived, to work in environments full of distractions, and to be required to respond quickly while managing heavy cognitive loads.10 These are working conditions in which analytical thinking strategies are difficult to apply, especially given that such strategies require the cooperation of brain structures that suffer greatly from sleep deprivation.11,12 In such conditions, many physicians default to intuition. However, change is not impossible.

Potential Debiasing Strategies

Pat Croskerry, an expert in clinical decision making, suggests that 3 things must occur for bias-related diagnostic and treatment errors to improve: (1) physicians must fully appreciate the contribution of cognitive biases to errors in medical decision making, (2) they must recognize that such errors are not inevitable, and (3) they must be optimistic that solutions to reduce bias can work.1

Simply increasing physicians’ familiarity with the many types of cognitive biases—and how to avoid them—may be one of the best strategies to decrease bias-related errors.1 Thus, education for medical students, residents, and fellows could fruitfully invest in training on cognitive biases, the role they play in diagnostic and treatment errors, and effective debiasing strategies. Two such strategies will be discussed below.

The practice of reflection reinforces behaviors that reduce bias in complex situations. A 2016 systematic review of cognitive intervention studies found that guided reflection interventions were associated with the most consistent success in improving diagnostic reasoning.13 A guided reflection intervention involves searching for and remaining open to alternative diagnoses, a willingness to engage in thoughtful and effortful reasoning, and reflection on one’s own conclusions, all with supportive feedback or challenge from a mentor.14

The same review suggests that cognitive forcing strategies may also have some success in improving diagnostic outcomes.13,15 These strategies involve conscious consideration of diagnoses other than those that come intuitively. One example involves reading radiographs in the emergency department. According to studies, a common pitfall among inexperienced clinicians in this situation is to call off the search once one positive finding has been noticed, which often leads to other abnormalities (eg, a second fracture) being overlooked. The forcing strategy here is to continue the search even after an initial fracture has been detected.15

While some data suggest that cognitive forcing strategies are not successful in reducing students’ diagnostic errors,16,17 a systematic review reveals that they can be efficacious in specific circumstances (eg, telling participants to consider alternative diagnoses rather than to be aware of misleading details).13 Overall, more research is needed to understand how other factors (eg, study setting, participant experience or knowledge level, bias or strategy introduction) influence cognitive forcing strategies’ effectiveness.

Using guided reflection and cognitive forcing strategies, medical trainees at all stages can be taught to acknowledge the risk of potential biases during decision making and then to deliberately counteract those potential biases. It is thought that, given time and sustained practice, certain metacognitive strategies can become second nature to physicians.15

Delivery Formats in Health Professions Education

In terms of format, cognitive tutoring systems may be useful. A 2013 study investigated the ability of a computer-based system, which involved virtual slides and a diagnostic reasoning interface, to detect and measure heuristics and biases in pathologists at different levels of training.18 The authors reported that biases and their association with diagnostic errors were successfully detected using this virtual slide system, suggesting that such a system could be used in the future to test methods for decreasing bias-related errors.

Another potentially useful format is simulation. A 2004 study presented resident physicians with a simulated case involving a difficult diagnosis and a cognitive error trap.19 Afterwards, the residents were debriefed on case-specific details and on cognitive forcing strategies, interviewed, and asked to complete a written survey. The results suggested that residents further along in their training (ie, postgraduate year 3) gained more awareness of cognitive strategies than those in earlier years, an indication that this tool could be more useful after a certain level of training has been completed. Future research should assess whether strategies learned from such simulations are applied later in bias-prone medical decisions.

Workshops and seminars might also be effective formats. A 60-minute workshop conducted at the 2017 meeting of the Society for Academic Emergency Medicine consisted of brief instruction on cognitive biases and debiasing strategies, and it significantly improved participants’ self-assessed recognition of bias and application of debiasing strategies.20 Although this intervention seems promising, future studies should examine the effects of such workshops using measures less subjective than self-assessment.

A seminar conducted at Wright State University with medical students and internal medicine resident physicians focused on cognitive bias in medical decision making and used an objective method of assessment.21 There is evidence that participation in the seminar improved scores on the Inventory of Cognitive Biases in Medicine (ICBM), an instrument designed to detect the impact of such biases on analytical thinking.22 It is important to note, however, that the validity of the ICBM has since been questioned.23 Reliable measurement tools will be critical to implementing effective educational interventions.

Alternatively, or perhaps in addition to the aforementioned formats, education on cognitive biases and debiasing strategies could be delivered in longer formats. A 2013 study examined the effect of a 3-part, 1-year curriculum on recognition and knowledge of cognitive biases and debiasing strategies in second-year residents.24 Those who completed the entire curriculum not only improved on their precurriculum scores but also performed better than third-year resident physicians who had not completed the curriculum.

Conclusion

Cognitive biases in clinical practice have a significant impact on care, often in negative ways. They sometimes manifest as physicians seeing what they want to see rather than what is actually there. Or they come into play when physicians make snap decisions and then prioritize evidence that supports their conclusions, as opposed to drawing conclusions from evidence. Sometimes physicians’ previous experiences can lead them astray. And, if outcomes are falsely attributed to decisions or actions, critical feedback opportunities are lost and bad habits can become ingrained.

Fortunately, cognitive psychology provides insight into how to prevent biases. Guided reflection and cognitive forcing strategies deflect bias through close examination of our own thinking processes. Although more research is required, data suggest that these strategies can be successful in the right circumstances. If they are to work, we must consistently include them in medical curricula. During medical education and consistently thereafter, we must provide physicians with a full appreciation of the cost of biases and the potential benefits of combating them.

References

  1. Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Acad Med. 2003;78(8):775-780.
  2. Stiegler MP, Neelankavil JP, Canales C, Dhillon A. Cognitive errors detected in anaesthesiology: a literature review and pilot study. Br J Anaesth. 2012;108(2):229-235.
  3. Molony DA. Cognitive bias and the creation and translation of evidence into clinical practice. Adv Chronic Kidney Dis. 2016;23(6):346-350.
  4. Nickerson RS. Confirmation bias: a ubiquitous phenomenon in many guises. Rev Gen Psychol. 1998;2(2):175-220.
  5. Satya-Murti S, Lockhart J. Recognizing and reducing cognitive bias in clinical and forensic neurology. Neurol Clin Pract. 2015;5(5):389-396.
  6. Croskerry P, Abbass AA, Wu AW. How doctors feel: affective issues in patients’ safety. Lancet. 2008;372(9645):1205-1206.
  7. Pelaccia T, Tardif J, Triby E, Charlin B. An analysis of clinical reasoning through a recent and comprehensive approach: the dual-process theory. Med Educ Online. 2011;16(1):5890.
  8. Croskerry P. Critical thinking and reasoning in emergency medicine. In: Croskerry P, Cosby KS, Schenkel SM, Wears RL, eds. Patient Safety in Emergency Medicine. Philadelphia, PA: Wolters Kluwer/Lippincott Williams & Wilkins; 2008.
  9. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 1: origins of bias and theory of debiasing. BMJ Qual Saf. 2013;22(suppl 2):ii58-ii64.
  10. Croskerry P. A universal model of diagnostic reasoning. Acad Med. 2009;84(8):1022-1028.
  11. Lieberman MD, Jarcho JM, Satpute AB. Evidence-based and intuition-based self-knowledge: an FMRI study. J Pers Soc Psychol. 2004;87(4):421-435.
  12. Goel N, Rao H, Durmer JS, Dinges DF. Neurocognitive consequences of sleep deprivation. Semin Neurol. 2009;29(4):320-339.
  13. Lambe KA, O’Reilly G, Kelly BD, Curristan S. Dual-process cognitive interventions to enhance diagnostic reasoning: a systematic review. BMJ Qual Saf. 2016;25(10):808-820.
  14. Mamede S, Schmidt HG. The structure of reflective practice in medicine. Med Educ. 2004;38(12):1302-1308.
  15. Croskerry P. Cognitive forcing strategies in clinical decisionmaking. Ann Emerg Med. 2003;41(1):110-120.
  16. Sherbino J, Yip S, Dore KL, Siu E, Norman GR. The effectiveness of cognitive forcing strategies to decrease diagnostic error: an exploratory study. Teach Learn Med. 2011;23(1):78-84.
  17. Sherbino J, Kulasegaram K, Howey E, Norman G. Ineffectiveness of cognitive forcing strategies to reduce biases in diagnostic reasoning: a controlled trial. CJEM. 2014;16(1):34-40.
  18. Crowley RS, Legowski E, Medvedeva O, et al. Automated detection of heuristics and biases among pathologists in a computer-based system. Adv Health Sci Educ Theory Pract. 2013;18(3):343-363.
  19. Bond WF, Deitrick LM, Arnold DC, et al. Using simulation to instruct emergency medicine residents in cognitive forcing strategies. Acad Med. 2004;79(5):438-446.
  20. Daniel M, Carney M, Khandelwal S, et al. Cognitive debiasing strategies: a faculty development workshop for clinical teachers in emergency medicine. MedEdPORTAL. 2017;13:10646.
  21. Hershberger PJ, Part HM, Markert RJ, Cohen SM, Finger WW. Teaching awareness of cognitive bias in medical decision making. Acad Med. 1995;70(8):661.
  22. Hershberger PJ, Markert RJ, Part HM, Cohen SM, Finger WW. Understanding and addressing cognitive bias in medical education. Adv Health Sci Educ Theory Pract. 1996;1(3):221-226.
  23. Sladek RM, Phillips PA, Bond MJ. Measurement properties of the Inventory of Cognitive Bias in Medicine (ICBM). BMC Med Inform Decis Mak. 2008;8:20.
  24. Reilly JB, Ogdie AR, Von Feldt JM, Myers JS. Teaching about how doctors think: a longitudinal curriculum in cognitive bias and diagnostic error for residents. BMJ Qual Saf. 2013;22(12):1044-1050.

Citation

AMA J Ethics. 2020;22(9):E773-778.

DOI

10.1001/amajethics.2020.773.

Conflict of Interest Disclosure

The author(s) had no conflicts of interest to disclose.

The viewpoints expressed in this article are those of the author(s) and do not necessarily reflect the views and policies of the AMA.