Can human empathy exist without a human? In the medical field, innovators are determined to find out. Within the last few months, a new mental health app has entered the medical self-help arena to support people suffering from common mental health conditions such as depression and anxiety. Engineered with empathy in mind and programmed to apply principles of cognitive behavioral therapy without the input of a human psychologist, Woebot is meant to serve as a trusted (virtual) companion through the Facebook Messenger app. Originally designed for college students, the chatbot has since been expanded to support adults of all ages through trying times as a readily available conversational companion. Amid the excitement, however, medical professionals and patients alike are left to wonder: will empathy delivered through an automated program have the same therapeutic effect as care provided by a human doctor?

It’s certainly an interesting question to ask, given the current discourse in the medical field about the role empathy should play in doctor-patient interactions. As I’ve written before, dispassionate treatment is often seen as more appropriate than empathetic care. In fact, according to a review published in Academic Medicine in 2011, empathy tends to decline over the course of medical school as doctors strive toward a dispassionate professionalism, suggesting that detachment is a trained, rather than intuitive, state for doctors. Despite the stigma of unprofessionalism, however, numerous studies have found that empathy actually improves patient satisfaction and, in one 2011 study of diabetic patients, even patient outcomes.

So where does this need for empathy leave us in the mental health field, where trust and connection between patient and caregiver are vital to treatment? Let’s return to the basics for a moment and define the term within a medical context. Clinical empathy, as defined in an article published in the British Journal of General Practice, is “an ability to: (a) understand the patient’s situation, perspective, and feelings (and their attached meanings); (b) to communicate that understanding and check its accuracy; and (c) to act on that understanding with the patient in a helpful (therapeutic) way.” With this definition in mind, the question left to us as patients and medical professionals becomes: can a program enact clinical empathy effectively without the involvement of a human practitioner?

With this question in mind, let’s return to our examination of Woebot. As a conversational partner, Woebot is meant to be charming and funny, but it never intends to fool its human user into thinking of it as a person. It weaves reminders of its AI nature into its jokes and commentary, and it is intended to function more as a receptive sounding board than as a psychologist. As Woebot Labs CEO Alison Darcy commented for MobiHealthNews, “[Woebot] helps people think, and they can start to learn more about how they function in the world with these emotions and thoughts.” In other words, Woebot is a mental health aide rather than a venue for therapeutic care. Isolated as it is from any human professionals, the best Woebot can do when faced with an at-risk user is to offer the number of a hotline: a direct link to human empathy.

Machines such as Woebot can mimic human empathy and offer conversational companionship, but I think clinical empathy remains firmly under the purview of human doctors. At its core, Woebot is a tool: a sounding board. It applies principles of cognitive behavioral therapy to encourage users to verbalize their thoughts and understand their feelings, and it is effective in that role. However, it cannot take the place of a full-fledged therapist, who can pick up on the nuances of subtext and body language and “act on that understanding with the patient.” For now, clinical empathy remains beyond the reach of automation and in the hands of doctors.

Moreover, I believe the entry of AI into empathetic care should prompt us to reconsider our conception of “professionally” dispassionate care in human medicine. The pivot toward automated empathy reveals patients’ very human need for understanding and connection in their care. Knowing this, I wonder if we aren’t in some way missing the mark by continuing to prize “dispassionate” professionalism over emotional connection in medicine. Perhaps programs can deliver a form of human empathy without the human, but I would argue that when it comes to clinical empathy, human doctors remain the most effective and most connected source of care.