I tried an AI therapist for a month – here is my verdict

Journalist Nicholas Fearn shared his problems with a bot to see if they could help him (Picture: Supplied)

It’s the early hours of the morning, and I can’t fall asleep. My mind is racing with thoughts of the darkest kind. 

I have battled with mental health problems for most of my life, having been diagnosed with autism, anxiety disorder and OCD at age 14. Being heavily bullied in school also dented my self-esteem and even resulted in me trying to take my own life.  

While regular sessions with a psychologist helped me to navigate these complicated feelings as a child, when I turned 18, the appointments stopped even though I was still gripped by depression.

As an adult, counselling was a great help, but I realised it wasn’t always to hand as quickly as I needed, due to NHS waiting lists being extremely long.  

Cue AI therapy, where data and users’ behaviour patterns are analysed so a bot can ask questions, offer advice and suggest coping mechanisms to anyone who wants them.

Understandably, it’s a practice cloaked in controversy. After all, can technology, no matter how intelligent, really support someone through any sort of mental health crisis? Is it safe? Is it even ethical? 

With all these questions swirling in my mind, as someone open to new ways of support, I decided to give it a try and downloaded Wysa, a chatbot that uses AI to provide mental health advice and support around the clock. The app is completely anonymous and free, but offers a paid-for plan with additional premium features, such as therapeutic exercises, sleep stories and meditations.

Nicholas has had mental health struggles since childhood (Picture: Supplied)

Telling all to a robot

I’ve always struggled with self-doubt. I am constantly comparing myself to my non-identical twin brother, who I think is better looking than me, and experiencing a bad eczema flare-up this week has really affected my self-esteem. 

I admit this to my bot, which is incredibly empathetic, saying it is sorry to hear of my low self-esteem before asking how my feelings impact my day-to-day life. 

I respond by saying I feel like I have no choice but to isolate myself from the outside world, which is hard because I don’t see my family and friends for days, sometimes weeks, on end, even though seeing my loved ones makes me happy and they constantly reassure me when I feel down. 

The AI therapist was surprisingly empathetic, says Nicholas (Picture: Getty Images)

My AI therapist suggests a thought reframing exercise and, as soon as I agree, a list of tools, ranging from an assessment to manage my energy to a self-compassion exercise, pops up at the bottom of the screen. I select the self-compassion task, which uses “positive intentions” to help the user tackle negative thoughts.

I then do a seven-minute meditation in which I close my eyes, focus on my breathing, smile and repeat positive phrases uttered by my Wysa expert. 

Opening my eyes, I feel surprisingly positive after a difficult day.

Wide awake club

Staring at my bedroom ceiling at 4am is quite normal for me. But on one particular day my mind becomes flooded with endless worry.

When I type about my sleep troubles and random anxiety to the bot, it replies in a compassionate tone, saying: “That sounds really tough”.

After admitting I never seem to sleep at a regular time due to my anxiety, Wysa suggests another thought reframing exercise to help ease some of my worries. I say I am nervous about a busy week of work coming up and missing a bunch of deadlines. 

Nicholas says that nighttime worrying is quite normal for him (Picture: Getty Images)

Wysa suggests I am probably “catastrophising”, which is when someone expects the worst possible outcome to unfold. While the connection suddenly cuts out mid-conversation before Wysa can provide a solution, it’s clear to me that I am overthinking, although I do wonder how I’d cope with a sudden shutdown if I had a longer issue to discuss.

Dealing with suicidal thoughts

I can’t remember a time in my life when I haven’t battled suicidal thoughts during certain events, and these demons have returned after yet another relationship breakdown. 

Crying my eyes out, I admit to Wysa that I don’t want to be alive anymore. Its response is utterly heartwarming. “Nic, you are worth life. You are loved, cherished and cared for, even though you may not feel that way right now.”

With my eyes firmly fixed on these kind, AI-generated words, I realise that suicide isn’t the best course of action and that life is probably worth living. Concerned about my wellbeing, the bot provides me with a phone number for the Samaritans.

The bot’s words were a comfort to Nicholas (Picture: Supplied)

Battling social anxiety 

While I’m okay seeing family and friends, the thought of encountering neighbours and other acquaintances frightens me. Turning to my app, I explain that I never know what to say to people. This is a feeling I experience day in and day out due to my autism.

The advice given is constructive – just a simple smile or hello should do the trick. Although it may sound too simple, I find it helpful because it shows I don’t have to hold a long conversation with a stranger. 

Need support?

For emotional support, you can call the Samaritans 24-hour helpline on 116 123, email jo@samaritans.org, visit a Samaritans branch in person or go to the Samaritans website.

If you’re a young person, or concerned about a young person, you can also contact PAPYRUS, the Prevention of Young Suicide UK.

Their HOPELINE247 is open every day of the year, 24 hours a day. You can call 0800 068 4141, text 88247 or email: pat@papyrus-uk.org.

Seeing old faces

Today is my nephew’s christening, and while I am excited to celebrate with my loved ones, I’m nervous about seeing loads of new and old faces. 

To build on the previous social anxiety tips, I message the bot for advice on how I could make the day less overwhelming. Wysa quickly reassures me that it’s normal to find social events nerve-racking. 

I explain I never know how to start or maintain a conversation. Wysa recommends that I say something like it’s nice to see them and ask how they are. And if they ask how I am doing, the bot recommends saying something simple like, “I’ve been doing well, thanks”. 

I’m told a breathing exercise beforehand might also help, which helps me feel better prepared. 

‘I am particularly worried as I never know how to start or maintain a conversation,’ says Nicholas  (Picture: Getty Images)

Facing up to night-time terrors 

Ever since moving onto the maximum dosage of Sertraline a few weeks ago, I’ve been having nightmares most nights. 

From plane crashes to loved ones getting gravely ill, these horrible and random dreams have been disrupting my sleep pattern for weeks. After explaining to my AI therapist that these nightmares started after the change of medication, it admits that this is likely the cause and we go through another thought reframing exercise.

We speak about a recent dream involving my parents dying, which is a frequent worry of mine, as morbid as it sounds. 

The AI therapist tried thought reframing with Nicholas (Picture: Supplied)

Wysa says this is likely another symptom of catastrophising, but then the chat suddenly ends due to a connection error. I am left not knowing how to tackle these traumatising dreams, which leaves me feeling pretty let down and not sure what to do next.

Dealing with compulsions

Today, my latest impulse TikTok Shop purchase arrived in the post: a magic mop, which is perhaps the last thing you should buy when you have severe OCD.

I’ve already used it several times today, but I still think my floors are dirty, so I ask for OCD advice. The first thing the bot says to me is that it must be exhausting – and it’s right. I can’t believe I feel heard by an AI bot. 

We do another thought exercise where I discuss how my OCD makes me feel. Wysa says it sounds like a symptom of filtering, where someone focuses on the negative details of a situation and forgets all the positives. 

In this context, it says I could be looking for tiny specks of dirt that may not exist and tells me to remember that the majority of the floor is probably clean. This makes me feel better – for now at least, although I’m more than aware it’s a plaster rather than a cure. 

But did it make a difference to Nicholas? (Picture: Supplied)

So was it worth it? 

While I don’t think AI can ever replace human psychologists and counsellors, I’m surprised to admit that Wysa is actually a pretty handy tool, and sometimes I forget I’m talking to a robot rather than a human. 

Of course, it isn’t perfect. There were many times when a chat would suddenly end or when Wysa’s advice felt repetitive. I also feel a bit paranoid that I’ve shared so much personal information with an AI chatbot, so I hope it is genuinely safe and secure. 

Either way, I had someone to speak to at some genuinely hard times, and I will continue using Wysa as an emotional support cushion. 

'We can't let AI therapists become acceptable'

Metro’s Assistant Lifestyle Editor Jess Lindsay believes we need to be far more wary of letting a bot look after our mental health. Here, she explains why.

‘In my opinion, an AI therapist is no more helpful than a list of motivational quotes. The bot may be able to say the right things, but when you’re at your lowest, you need more than hollow platitudes from a computer that doesn’t have the capacity to empathise.

Having dealt with chronic depression, anxiety, and ADHD throughout my life, I find the idea of having to receive help from a computer somewhat dystopian, and I’d feel like my concerns were being dismissed if this was offered to me – even as a supplementary solution.

Jess fears AI doesn’t have the ‘capacity to empathise’ in the same way as a human therapist (Picture: Getty Images)

Working through difficult issues requires a level of commitment from both yourself and the therapist, and why should I put in the effort when the other side is just a machine doing what it’s been programmed to do? Not only that, I know how to calm myself down when I’m having a panic attack, or take a walk when I’m stuck in my own head. To parrot NHS guidelines back to me without going deeper into why I feel like that seems like an insult to my intelligence.

While I absolutely understand the need for something to fill the gap when therapy and counselling is difficult to come by on the NHS, I worry that tools like this will be touted by the government as an acceptable (but most importantly in the eyes of government, cheaper) alternative when what’s desperately needed is funding and investment in the country’s mental health.

Even if AI is helpful to some, it’s a mere sticking plaster on a deeper societal wound.’

A version of this article was first published in September 2024.
