
Can we replace human empathy in healthcare?

In a May 2021 paper published in the journal AI & Society, clinical empathy expert and Berkeley Public Health bioethics professor Jodi Halpern, MD, PhD, posits that artificial intelligence (AI) cannot replace human empathy in the healthcare setting and that empathy is key to the successful treatment of patients.

The use of AI has shown promising results in clinical medicine, including in research and treatment. However, in her paper “In principle obstacles for empathic AI: why we can’t replace human empathy in healthcare,” Halpern—whose 2011 book From Detached Concern to Empathy: Humanizing Medical Practice was called “seminal” by JAMA—outlines the obstacles to applying AI in areas of clinical medicine and care where empathy is important. She concludes that these problems cannot be solved with any of the technical and theoretical approaches to AI currently in use.

Clinical empathy “involves attuning to another person’s emotional meanings and trying to imagine the situation from their perspective,” says Halpern. “The attitude necessary for this is genuine curiosity to hear more about another’s perspective based on recognizing that you don’t know another person’s world or what matters most to them.”

This type of human connection is vital to clinical care, Halpern says, and improves outcomes in three ways. “First, outcomes depend on correct diagnosis. Getting a good history is essential for correct diagnosis,” she says. “Videotaped doctor-patient interactions, in studies replicated internationally, show that patients disclose much more information when doctors are demonstrably empathic.

“Second, outcomes depend on patients adhering to treatment. More than half of prescriptions are not followed for a variety of reasons. The biggest predictor of adherence to treatment is trust in your physician. Empathic communication by the physician is a big predictor of trust.

“Third, when there is a serious diagnosis, patients cope better with the ‘bad news’ when it is delivered in an empathic context.”

In her paper, Halpern concludes that empathic AI is either impossible or unethical: impossible because AI lacks genuine empathy, and unethical because using AI in these situations can erode the expectation that human beings in distress deserve real, human empathy.

Halpern says she’d like to see AI used “for all the aspects of medicine it can contribute to, but not to use it to replace primary doctor-patient relationships as sources of therapeutic empathy.”



Can we replace human empathy in healthcare? © 2021 by UC Berkeley School of Public Health is licensed under CC BY-NC-ND 4.0 (Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International). Credit must be given to the creator; only noncommercial use is permitted; no derivatives or adaptations are permitted.