State of the Art and Science
May 2020

How Will Artificial Intelligence Affect Patient-Clinician Relationships?

Matthew Nagy, MPH, and Bryan Sisk, MD
AMA J Ethics. 2020;22(5):E395-400. doi: 10.1001/amajethics.2020.395.

Abstract

Artificial intelligence (AI) could improve the efficiency and accuracy of health care delivery, but how will AI influence the patient-clinician relationship? While many suggest that AI might improve the patient-clinician relationship, various underlying assumptions will need to be addressed to bring these potential benefits to fruition. Will off-loading tedious work result in less time spent on administrative burden during patient visits? If so, will clinicians use this extra time to engage relationally with their patients? Moreover, given the desire and opportunity, will clinicians have the ability to engage in effective relationship building with their patients? In order for the best-case scenario to become a reality, clinicians and technology developers must recognize and address these assumptions during the development of AI and its implementation in health care.

AI Uncertainty

Artificial intelligence (AI), defined as the capability of a machine to imitate intelligent human behavior,1 promises to become an innovative and disruptive force in medicine. Emerging technologies will have the capacity to extract and analyze clinical and scientific data in a fraction of the time it would take a human physician. For example, a radiologist might view hundreds of thousands of scans throughout her career, while a deep-learning algorithm could incorporate data from millions of scans instantaneously to process an image and highlight abnormalities.2 Similarly, a future oncologist might utilize AI to analyze scientific literature and identify personalized treatments for specific mutations in a patient’s tumor.3,4 While most experts believe AI will facilitate improved technical care for patients in the near future, it is uncertain how these advancing technologies will affect the relationship between patients and clinicians.3,5
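To make the radiology example concrete, the sketch below shows, in rough outline, how a deep-learning system might score a single scan for abnormality. This is a minimal illustration only: the network architecture, input size, and decision threshold are hypothetical placeholders, and a clinical system would be trained and validated on millions of labeled images.

```python
# A minimal, hypothetical sketch of deep-learning inference on a single scan.
# The architecture, input size, and 0.5 threshold are illustrative placeholders,
# not a clinical system; real models are trained on millions of labeled images.
import torch
import torch.nn as nn

class TinyAbnormalityDetector(nn.Module):
    """Toy convolutional network that outputs a probability of abnormality."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(16, 1)

    def forward(self, x):
        h = self.features(x).flatten(1)           # (batch, 16) pooled features
        return torch.sigmoid(self.classifier(h))  # probability of abnormality

model = TinyAbnormalityDetector().eval()
scan = torch.rand(1, 1, 256, 256)  # stand-in for one preprocessed radiograph
with torch.no_grad():
    p_abnormal = model(scan).item()
if p_abnormal > 0.5:  # a real threshold would be tuned and clinically validated
    print(f"Flag for radiologist review (p = {p_abnormal:.2f})")
```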

A healing patient-clinician relationship is formed by a patient’s and clinician’s mutual trust, respect, and commitment, a relationship that continues to strengthen as rapport and mutual understanding develop.6 Establishing and maintaining this healing relationship is central to providing effective care, and strong relationships can improve both a patient’s health care experience and clinical outcomes.7 According to the Institute of Medicine report, Crossing the Quality Chasm: A New Health System for the 21st Century, building and maintaining these relationships is also essential to improving the overall health care system.8 Without the trust that emanates from a healing relationship, patients can experience anxiety, frustration, and second-guessing. Given the importance of building and maintaining these relationships, the integration of emerging technologies into medical care should aim to promote rather than diminish the relationships between clinicians and patients.

Whether AI will harm or help the patient-clinician relationship in the future remains uncertain. Some experts argue that incorporating AI into medical care will enhance the patient-clinician relationship by off-loading tedious work, thus allowing clinicians to spend more time directly engaging with their patients.9 Additionally, AI might provide richer and more specific information about an individual patient’s treatment options and expected outcomes. Such personalized data could allow clinicians to engage their patients more meaningfully in shared decision making. Others, however, worry that the clinician’s role might become obsolete if patients value the increased diagnostic and treatment accuracy offered by AI more than they value human interaction.3 Even if patients still value the humanistic aspects of medical care, some believe these relational needs might soon be met by machines, such as conversational agent systems.3

Even if AI fulfills its promise of increased efficiency and treatment personalization, it might not improve the patient-clinician relationship. The link between successful implementation of AI in health care and maintaining or improving the patient-clinician relationship relies on several assumptions. In this paper, we will highlight 3 key assumptions (though more may exist) that underlie the optimistic view that AI will improve the healing patient-clinician relationship. If these assumptions are not acknowledged and addressed now, then novel technologies might exacerbate, rather than mitigate, current challenges to these relationships.

Off-loading Tedious Work

American clinicians spend appreciable time analyzing patient data, developing a differential diagnosis, and evaluating potential treatment options.10 Despite this effort, the vast amount of clinical and research data available has long surpassed physicians’ cognitive processing capabilities,11 leading to assessments that are arduous yet incomplete. By the time the clinician reaches the patient’s room, his attention is further divided by tedious charting responsibilities.

Future AI technologies will likely decrease the clinician’s tedious charting responsibilities before, during, and after the patient encounter. Rather than the clinician spending an inordinate amount of time analyzing data related to a patient’s condition, AI could sift through millions of patient-specific data points and provide a differential diagnosis, prognosis, and treatment options both more quickly and more accurately than clinicians. During the clinical visit, voice recognition technology might eliminate manual note entry into the electronic health record.12 Similarly, clinicians might be able to order medications or tests verbally while in conversation with the patient, allowing for fewer peripheral tasks and greater attention to the patient’s needs.
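As a rough illustration of one downstream step such a system would need, the sketch below parses an already-transcribed clinician utterance into a structured medication order. The regular expression and order schema are toy assumptions, not a real clinical interface; a deployed system would pair a validated speech-recognition front end with clinical vocabularies and safety checks.

```python
# A hypothetical sketch of the step after speech recognition: parsing a
# transcribed clinician utterance into a structured order. The pattern and
# order schema are toy assumptions, not a real clinical API.
import re
from typing import Optional

ORDER_PATTERN = re.compile(
    r"order\s+(?P<drug>[a-z]+)\s+(?P<dose>\d+\s?(?:mcg|mg|g))",
    re.IGNORECASE,
)

def parse_verbal_order(transcript: str) -> Optional[dict]:
    """Extract a (drug, dose) order from a transcript, if one is present."""
    match = ORDER_PATTERN.search(transcript)
    if match is None:
        return None  # no recognizable order; a real system would ask to confirm
    return {"drug": match.group("drug").lower(), "dose": match.group("dose")}

print(parse_verbal_order("Let's order amoxicillin 500 mg for ten days."))
# -> {'drug': 'amoxicillin', 'dose': '500 mg'}
```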

By decreasing arduous work and time spent analyzing data, AI presumably will facilitate improved information exchange and shared decision making between patients and clinicians.9,13 While technical advances might decrease some analytical and administrative demands, AI could also increase the interpersonal demands of patient care. Instead of 1 or 2 treatment options to consider for a given disease, AI might offer 6 or 7 possible treatments, along with a wealth of information regarding prognosis and adverse effects. Additionally, many patients might experience an initial distrust of AI, especially since the “black-box” nature of some technologies will make it impossible for the clinician to explain how the algorithm generated a given recommendation.14 As such, the clinician might spend time explaining and vouching for the AI system’s recommendations to patients. Moreover, an increase in available information necessitates more time to educate the patient, elicit patient values, and come to a shared decision.15 Thus, although many current tasks of clinical care might be off-loaded to an algorithm in the future, the time demand and intentional effort required to provide high-quality clinical care might not decrease and could in fact increase.
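One way clinicians might partially open the black box is with post-hoc probes such as permutation importance, which estimates how much each input drives a model’s output by scrambling that input and measuring the resulting drop in accuracy. The sketch below is illustrative only: the “model” and feature names are stand-ins, and real clinical explainability is substantially harder than this toy suggests.

```python
# A hypothetical sketch of permutation importance, one post-hoc probe for
# black-box models. The "model" and feature names here are stand-ins; real
# clinical explainability is substantially harder than this toy suggests.
import numpy as np

rng = np.random.default_rng(seed=0)
X = rng.normal(size=(200, 3))             # toy features for 200 patients
true_weights = np.array([0.1, 2.0, 0.0])  # hidden from the "clinician"
y = (X @ true_weights + rng.normal(scale=0.1, size=200)) > 0

def black_box_predict(inputs: np.ndarray) -> np.ndarray:
    """Stand-in for an opaque model we can query but not inspect."""
    return (inputs @ true_weights) > 0

baseline_acc = np.mean(black_box_predict(X) == y)
for j, name in enumerate(["age", "lab value", "imaging score"]):
    X_perm = X.copy()
    X_perm[:, j] = rng.permutation(X_perm[:, j])  # break link to the outcome
    drop = baseline_acc - np.mean(black_box_predict(X_perm) == y)
    print(f"{name}: accuracy drop {drop:+.3f}")   # larger drop = more influential
```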

Efficiency and Healing

Although time might be recouped from administrative duties by the implementation of AI technologies, structural and personal barriers might hinder clinicians from using this time to further develop their relationships with their patients. For example, the time allotted for each patient visit has remained relatively stable over time, yet the complexity of cases and the number of administrative tasks have increased.16 This overstretched clinical environment has been driven, in part, by the business model of medicine. Facilitating longer visits would necessitate either a decrease in the volume of patients seen in clinic or an increase in the number of clinicians hired, both of which would decrease profit margins.17 Accordingly, if AI decreases the time required for a patient visit, the health care system might respond by increasing the volume of patients seen per day rather than allowing time for relationship development and shared decision making. Administrators might determine, for example, that AI-driven efficiency allows clinicians to see 25% more patients per day. Physicians could then end up with more tightly packed schedules and less time allotted for each visit.

Even if the clinical load remains stable, personal barriers might prevent some clinicians from engaging with their patients to develop trust and elicit their values regarding goals of care. Highly personal and emotional communication can make some clinicians uncomfortable,18 although one study found that many patients with serious illness prefer their clinicians to provide sensitive, acknowledging, and supportive statements.19 As the second author and colleagues have previously argued, such personal and emotional communication should be viewed as a complex clinician behavior that is influenced by cognitive, social, economic, and cultural factors.20 In Western medical contexts, physicians are often trained to remain emotionally detached in order to maintain scientific and medical objectivity. Some clinicians worry that being fully emotionally present could be characterized as unprofessional or might lead to personal distress.21 Alternatively, other clinicians might not view such value-laden discussions as their responsibility, or they might prioritize other tasks.22 Even if novel technologies provide clinicians with more time and richer data sets, persistent personal barriers can impede the development of healing relationships. As such, future work should aim to address personal and professional barriers that can hinder the development of a trusting and open patient-clinician relationship.

Engaging Patients

If we assume that AI technologies will provide clinicians with more time and richer patient data, and we further assume that clinicians will be highly motivated to engage in relationship building, another critical assumption remains: clinicians will be able to engage meaningfully in these relationship-building activities. We believe that most clinicians genuinely care about their patients and want the best for them. Thus, one might assume that clinicians with additional time and sufficient motivation would translate these intentions into fruitful conversations aimed at better understanding patients’ beliefs and values in order to provide the best individualized care. A limited skill set, however, can trump time and motivation. For example, many clinicians report low confidence in their ability to engage in difficult or emotionally charged conversations as a reason for not engaging in shared decision making.23 Similarly, some clinicians avoid discussing their patient’s psychosocial concerns because they are unsure how to respond.24

Improving clinicians’ communication and social skills will likely require multiple approaches, such as admitting medical students partly on the basis of their social skills and capacity for empathy, early and continued training in communication and relationship building, increased attention to preventing or addressing burnout and moral distress, and opportunities for continued feedback on communication skills. Determining the best approach is an empirical question that is beyond the scope of this paper. However, continued work in this area is needed to maximize the benefit of future technologies in health care.

Conclusion

Advanced AI technology has the potential to improve the efficiency and accuracy of medical care, but, as Francis Peabody pointed out in 1927, “The treatment of a disease may be entirely impersonal; the care of a patient must be completely personal.”25 The healing patient-clinician relationship is an essential aspect of health care. Without forethought and planning, the implementation of new technologies might diminish the patient-clinician relationship in the name of efficiency, accuracy, or cost reduction. As such, clinicians, technology developers, administrators, and patient advocates should take steps to maintain the centrality of the healing relationship in medical care as AI technologies are developed and further integrated into the health care system.

References

  1. Mintz Y, Brodie R. Introduction to artificial intelligence in medicine. Minim Invasive Ther Allied Technol. 2019;28(2):73-81.
  2. Hosny A, Parmar C, Quackenbush J, Schwartz LH, Aerts HJWL. Artificial intelligence in radiology. Nat Rev Cancer. 2018;18(8):500-510.
  3. Goldhahn J, Rampton V, Spinas GA. Could artificial intelligence make doctors obsolete? Schweiz Arzteztg. 2019;100(8):242-244.
  4. Zhong QY, Mittal LP, Nathan MD, et al. Use of natural language processing in electronic medical records to identify pregnant women with suicidal behavior: towards a solution to the complex classification problem. Eur J Epidemiol. 2019;34(2):153-162.
  5. Jiang F, Jiang Y, Zhi H, et al. Artificial intelligence in healthcare: past, present and future. Stroke Vasc Neurol. 2017;2(4):230-243.
  6. Epstein RM, Street RL Jr. Patient-Centered Communication in Cancer Care: Promoting Healing and Reducing Suffering. Bethesda, MD: National Cancer Institute; 2007. NIH publication 07-6225. https://cancercontrol.cancer.gov/brp/docs/pcc_monograph.pdf. Accessed March 10, 2020.
  7. Kelley JM, Kraft-Todd G, Schapira L, Kossowsky J, Riess H. The influence of the patient-clinician relationship on healthcare outcomes: a systematic review and meta-analysis of randomized controlled trials. PLoS One. 2014;9(4):e94207.
  8. Committee on Quality of Health Care in America, Institute of Medicine. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academy Press; 2001.
  9. Fogel AL, Kvedar JC. Artificial intelligence powers digital medicine. NPJ Digit Med. 2018;1:5.
  10. Sinsky C, Colligan L, Li L, et al. Allocation of physician time in ambulatory practice: a time and motion study in 4 specialties. Ann Intern Med. 2016;165(11):753-760.
  11. Stead WW, Lin HS, eds; National Research Council. Computational Technology for Effective Health Care: Immediate Steps and Strategic Directions. Washington, DC: National Academies Press; 2009.
  12. Kong X, Ai B, Kong Y, et al. Artificial intelligence: a key to relieve China’s insufficient and unequally-distributed medical resources. Am J Transl Res. 2019;11(5):2632-2640.
  13. Krittanawong C. The rise of artificial intelligence and the uncertain future for physicians. Eur J Intern Med. 2018;48:e13-e14.
  14. London AJ. Artificial intelligence and black-box medical decisions: accuracy versus explainability. Hastings Cent Rep. 2019;49(1):15-21.
  15. Karches KE. Against the iDoctor: why artificial intelligence should not replace physician judgment. Theor Med Bioeth. 2018;39(2):91-110.
  16. Linzer M, Bitton A, Tu SP, Plews-Ogan M, Horowitz KR, Schwartz MD; Association of Chiefs and Leaders in General Internal Medicine (ACLGIM) Writing Group. The end of the 15-20 minute primary care visit. J Gen Intern Med. 2015;30(11):1584-1586.
  17. Branning G, Vater M. Healthcare spending: plenty of blame to go around. Am Health Drug Benefits. 2016;9(8):445-447.
  18. Kerasidou A, Horn R. Making space for empathy: supporting doctors in the emotional labour of clinical care. BMC Med Ethics. 2016;17:8.
  19. Visser LNC, Schepers S, Tollenaar MS, de Haes HCJM, Smets EMA. Patients’ and oncologists’ views on how oncologists may best address patients’ emotions during consultations: an interview study. Patient Educ Couns. 2018;101(7):1223-1231.
  20. Sisk BA, Mack JW, DuBois J. Knowing versus doing: the value of behavioral change models for emotional communication in oncology. Patient Educ Couns. 2019;102(12):2344-2348.
  21. Welp A, Meier LL, Manser T. Emotional exhaustion and workload predict clinician-rated and objective patient safety. Front Psychol. 2015;5:1573.
  22. Forsey M, Salmon P, Eden T, Young B. Comparing doctors’ and nurses’ accounts of how they provide emotional care for parents of children with acute lymphoblastic leukaemia. Psychooncology. 2013;22(2):260-267.
  23. Gravel K, Légaré F, Graham ID. Barriers and facilitators to implementing shared decision-making in clinical practice: a systematic review of health professionals’ perceptions. Implement Sci. 2006;1:16.
  24. Fagerlind H, Kettis Å, Glimelius B, Ring L. Barriers against psychosocial communication: oncologists’ perceptions. J Clin Oncol. 2013;31(30):3815-3822.
  25. Peabody FW. The care of the patient. JAMA. 1927;88(12):877-882.


Conflict of Interest Disclosure

The author(s) had no conflicts of interest to disclose.

The viewpoints expressed in this article are those of the author(s) and do not necessarily reflect the views and policies of the AMA.