
Rise of the Robot Therapists (part 2)

This is the second post in a series; the first appeared in December 2017 (click here for the 2017 post).

More than three years after my initial post about Artificial Intelligence (AI) therapists, the technology has developed at a shocking pace. Does this make therapy-by-robot a sound option? The idea is that a "client" could talk to an app on a phone or through an in-home device like Alexa. In my initial post, I pointed out some flaws that made me doubt the effectiveness of this method, and I described a client who tried the most popular and advanced app ("Woebot") and found it highly scripted and unhelpful, offering irrelevant or obvious advice and misunderstanding the client just like the very worst type of therapist. But this is 2021, and AI technology has moved forward light years.

If you only have a minute, read the article itself--there is a hyperlink at the bottom of this page. If you have a few minutes (my blog platform tells me this entry is a 3-minute read), I will sum up my thoughts.

The primary issue with AI therapy is the relationship. My own research on online therapy, with a real therapist and a real client, has shown a very real problem with trust. Clients who had experienced in-office therapy, even if they preferred online therapy, still rated it "very important" to meet the therapist in person first. This suggests that even when speaking to a human being online, a client may, without realizing it, be unable to trust the therapist enough for therapy to be effective. I would assume the same or worse is true when speaking to a robot. To illustrate with a story: an American may think the local Mexican restaurant serves the very best food around. Then he goes to Mexico on his honeymoon, eats wonderful food, and wonders why he ever liked the local restaurant. Clients with no other experience may not perceive the deficiencies of online or AI therapy, or how much better and more effective in-person therapy would be--which is tragic.

The 2021 technology is definitely more advanced, but only for Cognitive Behavioral Therapy (CBT), a highly structured system of therapy based on here-and-now interventions. Other therapies, often more effective and more targeted to specific client concerns, rely on a thorough history taken from the client, asking "What has happened to you?" rather than "What is wrong with you?" In fact, good practice of CBT itself starts with a good history.

AI is not yet capable of grasping the complexities of a client's history, and so it cannot help clients refine their therapy goals and understand their presenting problem--getting to the root of the issue. For instance, "I'm anxious" often means "I had a very hard childhood, and it affects my daily life." AI still offers canned solutions tied to a diagnosis the client reports about himself. But so do a lot of therapists. In my experience, you might as well talk to a random person off the street as see most therapists, and the results are what you would expect from simply talking to someone about your problems. The article shows a graph of how the standard measure of depression, the PHQ-9 evaluation, is affected by Woebot: two weeks of treatment reduced the PHQ-9 score from 14 to 12. That translates into "Moderate" depression remaining "Moderate" depression--just a bit less so. I believe that talking to almost any therapist would have the same effect, since many depressed people simply have no one to talk to about their concerns.

These are important concerns, but another is privacy. Everyone now knows it is a simple matter for an Amazon or Google employee, or any hacker, to listen through an Alexa-type device at any time. More worrisome is hackers' ability to listen through your phone or computer even when it appears to be turned off. The only protection is to disable internet access, but to use AI therapy, the device must be online.

The article makes some noteworthy points, such as that, due to stigma, a person with a mental health concern may never see a therapist but might be willing to talk to a robot. In addition, therapy by AI can be offered at a vastly reduced cost--or even free. But as I said, with therapy you don't always get what you pay for. Evaluate your therapist's credentials, and if you go, make sure they give you a game plan based on research-validated therapy. Always ask your therapist whether they themselves have ever been in therapy for an extended period, if only to improve their skill and empathy. If the answer is no, walk out. Robots and therapists who think they "have it together" are going to be pretty ignorant.

Link to full article (click below or cut-and-paste):
