When Alison Darcy began developing a mental-health-focused chatbot for Facebook last summer, she made a point of giving it a robotic name — nothing even vaguely human, like Siri or Alexa. She went with something charmingly mechanical: Woebot. Her service draws on a short-term, goal-oriented psychotherapy method known as cognitive behavioral therapy to help people track their moods, and the name made clear that it was by no means a human replacement. It also showed off what Darcy saw as her program’s greatest strength.
"This is not a person that is ever going to judge me," the clinical research psychologist and former Stanford instructor said. "This is not a person that I’m going to burden by contacting at 2 a.m."
Nearly a year later, as her subscription-based chatbot becomes available to Facebook’s close to 2 billion users, Darcy has warmed to the idea of assigning Woebot human qualities, especially after noticing that some of her customers have professed their love to it.
"Woebot’s personality is modeled after Kermit the Frog," she said. "When you watch Kermit the Frog being interviewed, you’re not thinking, ‘Well, that’s ridiculous, it’s a puppet.’ You just go with it. You’re happy with that suspended disbelief. That’s kind of the feeling that Woebot has, too. People understand that he has limitations."
Woebot, which made its Facebook Messenger debut on Tuesday and costs $9 a week on a monthly subscription, is the latest addition to the growing collection of mental health apps that have gained footing amid this past decade’s wellness boom — the Great Goop Awakening, if you will. Though they vary in approach and content, their shared purpose is to streamline counseling, pushing it beyond the proverbial couch. The lineage began with ELIZA, a language-processing program developed in 1966 to mimic the vague responses of a psychotherapist, and has since grown into a full-blown industry. Around this time last year, the startup Talkspace secured $15 million to grow an app that virtually connects patients with real-life therapists. Now, as Silicon Valley places bets on a future full of driverless cars and autonomous indoor farms, more and more companies are eyeing a mental health market that eliminates humans altogether. Alongside apps like iCouch, PTSD Coach, and Joy, Woebot is paving the way for a new category of automated self-care, where an algorithm, rather than a human, helps people cope.
The demand is there, especially among a younger, digitally savvy generation. In the past decade, college counseling centers and mental health clinics have reported a growing number of students seeking help for anxiety and depression. Whether you chalk that up to increased coursework, shifting societal pressures, or the never-ending apocalyptic news cycle, it means more people are looking for affordable ways to work out their issues. In some cases, time, location, and cost can prevent patients from seeking assistance. And the mental health industry’s general reluctance to embrace nontraditional forms of treatment, or understand online culture, has also caused a divide between digital natives and their therapists. With Woebot, Darcy is hoping to fill a need that more traditional treatments can’t.
"We have to be able to outsource some of the tasks that therapists have to other things that are more scalable," she said. "It comes down to meeting people where they’re at, and I think we’re giving people who wouldn’t otherwise ever see a clinician a really good first experience of what therapy would look like."
It’s new territory for the profession. And in the eyes of Laura Lansrud-López, a clinical mental health counselor based in New Mexico who has embraced digital culture in her practice, this kind of bot-based therapy can be helpful — to a point. Apps that log moods and behaviors are simply digitizing a task that many therapists already ask their patients to complete. But she also warns that they can be insufficient for people with serious mental health issues.
"In the old days we would assign clients to use pencil and paper to track these things between sessions and bring in their worksheets, but now clients can use cell phones," she said in an email. "Hallelujah for technology! That being said, I would not recommend a client with a diagnosable mental health disorder such as PTSD, Major Depressive Disorder, or OCD use an app in place of professional psychotherapy."
For now, Darcy is confident that her bot — which she and her team designed to strike a balance between educational and affable — will do the trick. Rather than rely on some of the lazier iterations of automated Facebook Messenger personalities out there (I’m talking to you, Maroon 5), Woebot guides the conversation via a decision tree. It works like this: after the bot says something, the user is offered two or three canned responses in the form of bright blue buttons, a design that Darcy says trial participants enjoyed because it made the process easier on a smartphone. In some cases the user can type into the box to share his or her feelings, which the bot analyzes using natural language processing. Eventually, using basic AI tools, Woebot will personalize its responses.
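To make that flow concrete, here is a minimal sketch of how a decision-tree conversation with canned button responses and a free-text fallback might be structured. Woebot’s actual implementation isn’t public, so the node layout, prompts, and routing below are assumptions for illustration only.

```python
# Hypothetical sketch of a decision-tree chat flow like the one described
# above. Node ids, prompts, and routing are illustrative assumptions;
# Woebot's actual implementation is not public.
from dataclasses import dataclass, field

@dataclass
class Node:
    prompt: str                                            # what the bot says
    options: dict[str, str] = field(default_factory=dict)  # button label -> next node id
    free_text: bool = False                                # whether typed input is allowed

TREE = {
    "checkin": Node(
        prompt="How are you feeling today?",
        options={"Pretty good": "good", "Anxious": "anxious", "Down": "down"},
    ),
    "good": Node(prompt="Glad to hear it! Want to note what went well?", free_text=True),
    "anxious": Node(prompt="Let's slow down. What thought is looping right now?", free_text=True),
    "down": Node(
        prompt="Thanks for telling me. Would a mood-tracking exercise help?",
        options={"Sure": "exercise", "Not now": "checkin"},
    ),
    "exercise": Node(prompt="Rate your mood from 1 to 10.", free_text=True),
}

def step(node_id: str, user_input: str) -> str:
    """Advance the conversation one turn and return the next node id."""
    node = TREE[node_id]
    if user_input in node.options:  # a canned button press routes directly
        return node.options[user_input]
    if node.free_text:
        # Placeholder for the natural-language-processing step the article
        # mentions; a real system would classify the typed text here.
        return "checkin"
    raise ValueError(f"Unexpected input at node {node_id!r}: {user_input!r}")

# Example: the user taps the "Down" button at the daily check-in.
print(TREE[step("checkin", "Down")].prompt)
```

The buttons keep most turns deterministic, which is what makes the flow easy to follow on a phone, while the free-text branches mark where the natural-language layer Darcy describes would slot in.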
Alongside Woebot’s debut, Darcy is also publishing a study performed at Stanford’s Department of Psychiatry and Behavioral Sciences that speaks to the effectiveness of her app. In a comparative trial, 70 students were split into two groups: one was given an ebook from the National Institute of Mental Health titled Depression and College Students, while the other was set up with Woebot. At the end of the two-week trial, researchers found that those in the latter group "significantly reduced their symptoms of depression." It seems obvious that, tested against a static ebook, Woebot would emerge victorious. And the fact that students found the app helpful for the two weeks they were asked to use it doesn’t indicate how effective it might be in the long term. Nevertheless, Darcy was encouraged by the results.
"Some of the people were saying how much they had noticed their thinking had changed," she said. "Those kind of things you really hope for as a clinician, but it’s really rare to have a response that fast. We’re just really excited about the potential."
However accessible Woebot may be, it comes with a laundry list of complications, primarily concerning patient safety and security. The Woebot team has designed its product so that anyone reviewing a patient’s data or conversations behind the scenes cannot see who they are, but because the bot is hosted on Facebook, it doesn’t meet the requirements of the Health Insurance Portability and Accountability Act, the law designed to protect patients’ medical records. Darcy says her company is working quickly to create standalone iOS and Android bots that conform to the law. But according to Lauren Hazzouri, a Pennsylvania-based psychologist who specializes in media psychology, the possibility that Facebook is mining a private conversation for its own profit inherently compromises the therapeutic process.
"It seems to me that to even call this therapy is a mistake," Hazzouri said via email. "It is not. Without the therapeutic relationship, teaching [cognitive behavioral treatment] skill sets becomes psycho-education, not therapy."
When it comes to protecting a patient in crisis, Woebot is programmed to pass along hotline information and resources to clients who are not improving under its treatment. But Hazzouri argues that anticipating and handling those critical moments takes more nuance and interpersonal connection than a bot can offer.
"As a psychologist, I recognize the importance of tailoring treatment to the individual and using human intuition and years of experience to hear what the patient is saying and also what the patient is not saying," she said. "All in all, I’m not worried about job security. Let’s leave it at that."