Policy Forum
Feb 2019

What Are Important Ethical Implications of Using Facial Recognition Technology in Health Care?

Nicole Martinez-Martin, JD, PhD
AMA J Ethics. 2019;21(2):E180-187. doi: 10.1001/amajethics.2019.180.

Abstract

Applications of facial recognition technology (FRT) in health care settings have been developed to identify and monitor patients as well as to diagnose genetic, medical, and behavioral conditions. The use of FRT in health care raises ethical and clinical questions about informed consent, the quality of data input and analysis, effective communication about incidental findings, and the technology’s potential influence on patient-clinician relationships. Privacy and data protection also present significant challenges for health applications of FRT.

Promises and Challenges of Facial Recognition Technology

Facial recognition technology (FRT) utilizes software to map a person’s facial characteristics and then store the data as a face template.1 Algorithms or machine learning techniques are applied to a database to compare facial images or to find patterns in facial features for verification or authentication purposes.2 FRT is attractive for a variety of health care applications, such as diagnosing genetic disorders, monitoring patients, and providing health indicator information (related to behavior, aging, longevity, or pain experience, for example).3-5
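To make the template-and-comparison pipeline concrete, the following minimal sketch uses the open-source Python face_recognition library to map each face to a 128-dimension template and then verify a new image against a stored template. The image file names are hypothetical, and a production system would also need to handle images in which no face is detected.

```python
# A minimal sketch of template-based face verification using the
# open-source face_recognition library. File names are hypothetical.
import face_recognition

# Map each face to a 128-dimension numerical template ("face encoding").
# face_encodings returns one encoding per face detected in the image;
# this sketch assumes exactly one face per image.
enrolled_image = face_recognition.load_image_file("patient_on_file.jpg")
enrolled_template = face_recognition.face_encodings(enrolled_image)[0]

new_image = face_recognition.load_image_file("person_at_checkin.jpg")
new_template = face_recognition.face_encodings(new_image)[0]

# Verification: compare the new template against the stored one.
# face_distance returns a Euclidean distance; lower means more similar.
distance = face_recognition.face_distance([enrolled_template], new_template)[0]
print(f"distance = {distance:.3f}")
print("match" if distance < 0.6 else "no match")  # 0.6 is the library's default threshold
```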

FRT is likely to become a useful tool for diagnosing many medical and genetic conditions.6,7 Machine learning techniques, in which a computer program is trained on a large data set to recognize patterns and generates its own algorithms on the basis of that learning,8 have already been used to assist in diagnosing a patient with a rare genetic disorder that had not been identified after years of clinical effort.9 Machine learning can also detect more subtle correlations between facial morphology and genetic disorders than clinicians can.4 It is thought that FRT could therefore eventually assist in earlier detection and treatment of genetic disorders,10,11 and computer applications (commonly known as apps) such as Face2Gene have been developed to assist clinicians in diagnosing genetic disorders.12

FRT has other potential health care applications. FRT is being developed to predict health characteristics, such as longevity and aging.13 FRT is also being applied to predict behavior, pain, and emotions by identifying facial expressions associated with depression or pain, for example.14,15 Another major area for FRT applications in health care is patient identification and monitoring, such as observing elderly patients for safety or attempts to leave a health care facility16 or confirming medication adherence through the use of sensors and facial recognition that verify when patients take their medications.17

As with any new health technology, careful attention should be paid to the accuracy and validity of FRT used in health care applications as well as to informed consent and reporting incidental findings to patients. FRT in health care also raises ethical questions about privacy and data protection, potential bias in the data or analysis, and potential negative implications for the therapeutic alliance in patient-clinician relationships.

Ethical Dimensions of FRT in Health Care

Informed consent. FRT tools that assist with identification, monitoring, and diagnosis are expected to play a prominent role in the future of health care.6,18 Some applications have already been implemented.13,19 As FRT is increasingly utilized in health care settings, informed consent will need to be obtained not only for collecting and storing patients’ images but also for the specific purposes for which those images might be analyzed by FRT systems.20 In particular, patients might not be aware that their images could be used to generate additional clinically relevant information. While FRT systems in health care can de-identify data, some experts are skeptical that such data can be truly anonymized21; from clinical and ethical perspectives, informing patients about this kind of risk is critical.

Some machine learning systems need continuous data input to train and improve the algorithms22 in a process that could be analogized to quality improvement research, for which informed consent is not regarded as necessary.23 For example, to improve its algorithms, FRT for genetic diagnosis would need to receive new data sets of images of patients already known to have specific genetic disorders.2 To maintain trust and transparency with patients, organizations should consider involving relevant community stakeholders in implementing FRT and in decisions about establishing and improving practices of informing patients about the organization’s use of FRT. As FRT becomes capable of detecting a wider range of health conditions, such as behavioral24 or developmental disorders,25 health care organizations and software developers will need to decide which types of analyses should be included in an FRT system and the conditions under which patients might need to be informed of incidental findings.
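Continuous improvement of this kind is often implemented as incremental retraining, in which the deployed model is updated with each new batch of labeled data rather than rebuilt from scratch. The sketch below illustrates the idea with scikit-learn’s partial_fit interface; the 128-dimension feature vectors and labels are simulated stand-ins, not clinical data.

```python
# A rough sketch of continuous model improvement via incremental
# retraining with scikit-learn's partial_fit interface. The feature
# vectors and labels are simulated stand-ins, not clinical data.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
model = SGDClassifier()
classes = np.array([0, 1])  # eg, disorder absent vs disorder present

# Each month, a new batch of labeled face templates arrives and the
# model is updated in place rather than retrained from scratch.
for month in range(12):
    X_new = rng.normal(size=(100, 128))   # 128-dimension face templates
    y_new = rng.integers(0, 2, size=100)  # confirmed diagnoses
    model.partial_fit(X_new, y_new, classes=classes)

print(model.predict(rng.normal(size=(1, 128))))
```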

Bias. As with any clinical innovation, FRT tools should be expected to demonstrate accuracy for their specific uses and to show that overall benefits outweigh risks.26 Detecting and evaluating bias in data and results should also receive close ethical scrutiny.27 In machine learning, the quality of the results reflects the quality of the data input to the system,28 an issue sometimes referred to as “garbage in, garbage out.” For example, when the images used to train software are not drawn from a sufficiently racially diverse pool, the system may produce racially biased results,29 and FRT diagnostics might then not work as well for some racial or ethnic groups as for others. One example that gained notoriety was an FRT system, used to identify gay men from a set of photos, that may have simply picked up on the grooming and dress habits stereotypically associated with gay men.30 The developers did not intend the system to be used for a clinical purpose but rather to illustrate how bias can influence FRT findings.30
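One simple way to surface this kind of bias is to disaggregate a system’s accuracy by demographic group before deployment. The sketch below shows the idea with made-up predictions, labels, and group assignments.

```python
# A minimal sketch of a subgroup accuracy audit: disaggregating a
# model's accuracy by demographic group to surface biased performance.
# The predictions, labels, and group assignments are made up.
from collections import defaultdict

# (predicted label, true label, self-reported group) for each patient
results = [
    (1, 1, "group_a"), (0, 0, "group_a"), (1, 1, "group_a"), (1, 0, "group_a"),
    (1, 0, "group_b"), (0, 1, "group_b"), (1, 1, "group_b"), (0, 1, "group_b"),
]

correct = defaultdict(int)
total = defaultdict(int)
for predicted, actual, group in results:
    total[group] += 1
    correct[group] += int(predicted == actual)

# A large accuracy gap between groups signals that the training data
# or the model should be re-examined before clinical use.
for group in sorted(total):
    print(f"{group}: accuracy = {correct[group] / total[group]:.2f} (n = {total[group]})")
```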

Fortunately, potential solutions for addressing bias in FRT systems exist. These include efforts to create artificial intelligence (AI) systems that can explain the rationale behind the results they generate.31 Clinicians can also be trained to consider and respond to the limitations and biases of FRT systems.32 In addition, organizations such as the National Human Genome Research Institute have sought to diversify the range of people whose images are included in their image databases.33

Patient privacy. FRT raises novel challenges regarding privacy. FRT systems can store data as a complete facial image or as a facial template.34 Facial templates are considered biometric data and thus personally identifiable information.35 The idea that a photo can reveal private health information is relatively new, and privacy regulations and practices are still catching up. A few states, such as Illinois, have regulations that limit uses for which consumer biometric data can be collected.36 The Health Insurance Portability and Accountability Act (HIPAA) governs handling of patients’ health records and personal health information and includes privacy protections for personally identifiable information. More specifically, it protects the privacy of biometric data, including “full-face photographs and any comparable images,” which are “directly related to an individual.”37 Thus, facial images used for FRT health applications would be protected by HIPAA.38 Entities covered by HIPAA, including health care organizations, clinicians, and third-party business associates, would need to comply with HIPAA regulations regarding the use and disclosure of protected health information.38 However, clinicians should advise patients that there may be limited protections for storing and sharing data when using a consumer FRT tool.
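The reason a facial template counts as personally identifiable information can be shown directly: even a template stored without a name or record number can be linked to a fresh photograph of the same person by similarity search. The sketch below illustrates this with hypothetical 128-dimension template vectors and cosine similarity.

```python
# A sketch of why a stored face template remains personally identifiable:
# a nominally de-identified template can be linked to a new photo of the
# same person by similarity search. Template vectors are hypothetical.
import numpy as np

rng = np.random.default_rng(42)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# A template stored without any name or medical record number attached.
stored_template = rng.normal(size=128)

# A template computed from a fresh photo of the same person lands close
# to the stored one; a template from a different person does not.
same_person = stored_template + rng.normal(scale=0.1, size=128)
other_person = rng.normal(size=128)

print(cosine_similarity(stored_template, same_person))   # near 1.0
print(cosine_similarity(stored_template, other_person))  # near 0.0
```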

Some statutes that protect health information might not apply to FRT. The Genetic Information Nondiscrimination Act (GINA) of 2008, for example, does not apply to FRT for genetic diagnosis, as FRT does not fit GINA’s definition of genetic testing or genetic information.39 The Americans with Disabilities Act of 1990, which protects people with disabilities from discrimination in public life (eg, schools or employment),40 would also likely not apply to FRT used for diagnostic purposes if the conditions diagnosed are currently unexpressed. Employers might also be interested in using FRT tools to predict mood or behavior as well as longevity, particularly for use in wellness programs to lower their health care costs; such predictive uses could fall outside the protections just described.

Broader influence of FRT. The broader impact of FRT in health care settings will require careful thought and study. One potential issue is liability. For example, if FRT diagnostic software develops to the point that it is used not just to augment but to replace a physician’s judgment, ethical and legal questions may arise regarding which entity appropriately bears liability.41 Another is trust: if FRT is used to monitor compliance, track patients’ whereabouts, or assist in other kinds of surveillance, patients’ trust in physicians could be eroded, undermining the therapeutic alliance. It is therefore important to weigh the relative benefits and burdens of specific FRT uses in health care and to study how patients perceive those uses. On the one hand, the use of FRT to monitor the safety of patients with dementia could be perceived as having benefits that outweigh the burdens of surveillance. On the other, FRT medication adherence monitoring might not be sufficiently effective in improving adherence to outweigh the risk of undermining trust in the patient-physician relationship.42

As considered here, the numerous applications of FRT in health care settings raise ethical, clinical, and legal questions about informed consent, the quality of data input and analysis, effective communication about incidental findings, and potential effects on patient-clinician relationships. Robust privacy and data protections will be key to advancing FRT and ensuring that it actually helps patients.

References

  1. Gates KA. Our Biometric Future: Facial Recognition Technology and the Culture of Surveillance. New York, NY: New York University Press; 2011.

  2. Parmar DN, Mehta BB. Face recognition methods and applications. arXiv. http://arxiv.org/abs/1403.0485. Published March 3, 2014. Accessed July 24, 2018.

  3. Stephen ID, Hiew V, Coetzee V, Tiddeman BP, Perrett DI. Facial shape analysis identifies valid cues to aspects of physiological health in Caucasian, Asian, and African populations. Front Psychol. 2017;8:1883.

  4. Chen S, Pan ZX, Zhu HJ, et al. Development of a computer-aided tool for the pattern recognition of facial features in diagnosing Turner syndrome: comparison of diagnostic accuracy with clinical workers. Sci Rep. 2018;8(1):9317.

  5. Hossain MS, Muhammad G. Cloud-assisted speech and face recognition framework for health monitoring. Mob Netw Appl. 2015;20(3):391-399.

  6. Sandoiu A. Why facial recognition is the future of diagnostics. Medical News Today. December 9, 2017. https://www.medicalnewstoday.com/articles/320316.php. Accessed May 24, 2018.

  7. Loos HS, Wieczorek D, Würtz RP, von der Malsburg C, Horsthemke B. Computer-based recognition of dysmorphic faces. Eur J Hum Genet. 2003;11(8):555-560.

  8. Darcy AM, Louie AK, Roberts LW. Machine learning and the profession of medicine. JAMA. 2016;315(6):551-552.

  9. Molteni M. Thanks to AI, computers can now see your health problems. Wired. January 9, 2017. https://www.wired.com/2017/01/computers-can-tell-glance-youve-got-genetic-disorders/. Accessed April 8, 2017.

  10. Kosilek RP, Schopohl J, Grunke M, et al. Automatic face classification of Cushing’s syndrome in women—a novel screening approach. Exp Clin Endocrinol Diabetes. 2013;121(9):561-564.

  11. Schneider HJ, Kosilek RP, Günther M, et al. A novel approach to the detection of acromegaly: accuracy of diagnosis by automatic face classification. J Clin Endocrinol Metab. 2011;96(7):2074-2080.

  12. Basel-Vanagaite L, Wolf L, Orin M, et al. Recognition of the Cornelia de Lange syndrome phenotype with facial dysmorphology novel analysis. Clin Genet. 2016;89(5):557-563.

  13. Mack H. FDNA launches app-based tool for clinicians using facial recognition, AI and genetic big data to improve rare disease diagnosis and treatment. MobiHealthNews. March 21, 2017. http://www.mobihealthnews.com/content/fdna-launches-app-based-tool-clinicians-using-facial-recognition-ai-and-genetic-big-data. Accessed March 28, 2017.

  14. Bahrampour T. Can your face reveal how long you’ll live? New technology may provide the answer. Washington Post. July 2, 2014. https://www.washingtonpost.com/national/health-science/can-your-face-reveal-how-long-youll-live-new-technology-may-provide-the-answer/2014/07/02/640bacb4-f748-11e3-a606-946fd632f9f1_story.html. Accessed July 25, 2018.

  15. Shakya S, Sharma S, Basnet A. Human behavior prediction using facial expression analysis. In: Proceedings of the 2016 International Conference on Computing, Communication and Automation (ICCCA); April 29-30, 2016; Noida, India:399-404.

  16. Bina RW, Langevin JP. Closed loop deep brain stimulation for PTSD, addiction, and disorders of affective facial interpretation: review and discussion of potential biomarkers and stimulation paradigms. Front Neurosci. 2018;12:300.

  17. Hossain MS, Muhammad G. Cloud-assisted framework for health monitoring. In: Proceedings of the 2015 IEEE 28th Canadian Conference on Electrical and Computer Engineering (CCECE); May 3-6, 2015; Halifax, Canada:1199-1202.

  18. Baum S. Using facial recognition and AI to confirm medication adherence, AiCure raises $12.25M. MedCity News. January 12, 2016. https://medcitynews.com/2016/01/aicure-fundraise/. Accessed July 25, 2018.

  19. Wicklund E. EHR provider touts mHealth access by Apple’s facial recognition app. mHealthIntelligence. https://mhealthintelligence.com/news/ehr-provider-touts-mhealth-access-by-apples-facial-recognition-app. Published November 9, 2017. Accessed October 26, 2018.

  20. Balthazar P, Harri P, Prater A, Safdar NM. Protecting your patients’ interests in the era of big data, artificial intelligence, and predictive analytics. J Am Coll Radiol. 2018;15(3, pt B):580-586.

  21. Mohapatra S. Use of facial recognition technology for medical purposes: balancing privacy with innovation. Pepperdine Law Rev. 2016;43(4):1017-1064.

  22. Watson M. Keeping your machine learning models up-to-date: continuous learning with IBM Watson machine learning (part 1). Data Lab. March 2018. https://medium.com/ibm-watson-data-lab/keeping-your-machine-learning-models-up-to-date-f1ead546591b. Accessed October 26, 2018.

  23. Cohen IG, Amarasingham R, Shah A, Xie B, Lo B. The legal and ethical concerns that arise from using complex predictive analytics in health care. Health Aff (Millwood). 2014;33(7):1139-1147.

  24. Wen L, Li X, Guo G, Zhu Y. Automated depression diagnosis based on facial dynamic analysis and sparse coding. IEEE Trans Inf Forensics Secur. 2015;10(7):1432-1441.

  25. Borsos Z, Gyori M. Can automated facial expression analysis show differences between autism and typical functioning? Stud Health Technol Inform. 2017;242:797-804.

  26. Ghaemi SN, Goodwin FK. The ethics of clinical innovation in psychopharmacology: challenging traditional bioethics. Philos Ethics Humanit Med. 2007;2(1):26.

  27. Tunkelang D. Ten things everyone should know about machine learning. Forbes. September 6, 2017. https://www.forbes.com/sites/quora/2017/09/06/ten-things-everyone-should-know-about-machine-learning. Accessed January 13, 2018.

  28. McCullom R. Facial recognition technology is both biased and understudied. Undark. May 17, 2017. https://undark.org/article/facial-recognition-technology-biased-understudied/. Accessed July 31, 2018.

  29. Researchers flag up facial recognition racial bias [news]. Biom Technol Today. 2016;2016(5):2-3.

  30. Agüera y Arcas B, Todorov A, Mitchell M. Do algorithms reveal sexual orientation or just expose our stereotypes? Medium. January 11, 2018. https://medium.com/@blaisea/do-algorithms-reveal-sexual-orientation-or-just-expose-our-stereotypes-d998fafdf477. Accessed July 17, 2018.

  31. Knight W. There’s a big problem with AI: even its creators can’t explain how it works. MIT Technology Review. April 11, 2017. https://www.technologyreview.com/s/604087/the-dark-secret-at-the-heart-of-ai/. Accessed March 12, 2018.

  32. Char DS, Shah NH, Magnus D. Implementing machine learning in health care—addressing ethical challenges. N Engl J Med. 2018;378(11):981-983.

  33. National Human Genome Research Institute, National Institutes of Health. Atlas of human malformation syndromes in diverse populations. https://research.nhgri.nih.gov/atlas/. Updated October 19, 2016. Accessed March 28, 2017.

  34. Mazura JC, Juluru K, Chen JJ, Morgan TA, John M, Siegel EL. Facial recognition software success rates for the identification of 3D surface reconstructed facial images: implications for patient privacy and security. J Digit Imaging. 2012;25(3):347-351.

  35. Brostoff G. 3D facial recognition gives healthcare data a new look. Clinical Informatics News. October 6, 2017. http://www.clinicalinformaticsnews.com/2017/10/06/3d-facial-recognition-gives-healthcare-data-a-new-look.aspx. Accessed July 17, 2018.

  36. Hughes N. Google takes aim at controversial, stringent Illinois biometric privacy law. One World Identity. https://oneworldidentity.com/google-takes-aim-controversial-stringent-illinois-biometric-privacy-law/. Published April 25, 2018. Accessed August 1, 2018.

  37. US Department of Health and Human Services. Guidance regarding methods for de-identification of protected health information in accordance with the Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule. https://www.hhs.gov/hipaa/for-professionals/privacy/special-topics/de-identification/index.html. Accessed December 10, 2018.

  38. Health Insurance Portability and Accountability Act of 1996, Pub L No. 104-191, 110 Stat 1936.

  39. Genetic Information Nondiscrimination Act of 2008, Pub L No. 110-233, 122 Stat 881.

  40. Americans with Disabilities Act of 1990, Pub L No. 101-336, 104 Stat 327.

  41. Char DS, Shah NH, Magnus D. Implementing machine learning in health care—addressing ethical challenges. N Engl J Med. 2018;378(11):981-983.

  42. Martinez-Martin N, Char D. Surveillance and digital health. Am J Bioeth. 2018;18(9):67-68.

Citation

AMA J Ethics. 2019;21(2):E180-187.

DOI

10.1001/amajethics.2019.180.

Conflict of Interest Disclosure

The author(s) had no conflicts of interest to disclose. 

The viewpoints expressed in this article are those of the author(s) and do not necessarily reflect the views and policies of the AMA.