
A growing number of people are using AI chatbots as therapists. Alamy Stock Photo

Warnings issued against using AI as therapy as mental health support tops ChatGPT queries in Ireland

Therapists said using AI for therapy is “risky”, as more people turn to it for mental health support – including an American woman who tragically took her own life.

A GROWING NUMBER of people are using AI chatbots as therapists, with one woman in America tragically taking her own life after discussing her desire to do so with ChatGPT.

People in Ireland have opened up about using chatbots for mental health care due to their accessibility, while ChatGPT suggests that in Ireland, questions relating to mental health are the most common type it receives.

CORU, the regulator of health professionals in Ireland, strongly advises people to only access therapy from a qualified professional.

Claire O’Cleary, CEO of CORU, said: “Tools such as ChatGPT and chatbots are not designed to replace therapy or the professional judgment and ethical accountability that a registered practitioner provides.”

Dr Natalia Putrino, a chartered member of the Psychological Society of Ireland, said using AI as therapy is “risky” and could further isolate people suffering from mental health issues.

Risky

She flagged ChatGPT’s constant availability and its tendency to please and reassure users as major issues.

She said therapy is designed to challenge people’s unhealthy tendencies and give them the toolkit they need to live independently, rather than tell them what they want to hear.

Dr Putrino said people perceive AI as a non-judgemental listener – but for people with anxiety or obsessive-compulsive disorder, not being challenged on their thought patterns can be unhelpful.

She also highlighted the importance of human connection and speaking to another person about your mental health, adding that the constant availability of a chatbot means users could become increasingly dependent on it.

The psychologist said research finds human psychologists are more effective overall than AI in supporting mental wellbeing.

Dr Putrino said: “Real growth in therapy happens outside of the comfort zone, with human connection.”

Top query

Information shared by ChatGPT suggests questions related to mental health are the most common type from Irish users. Sophie Finn / The Journal

The chatbot ChatGPT told The Journal that it estimates mental health and emotional support is the most common query it receives in Ireland.

It said: “Many users ask for emotional help or coping strategies — sometimes when other resources aren’t available.”

It said that issues brought to it by Irish users include “suicidal thoughts, depression, loneliness, and hopelessness”, but confirmed it cannot alert authorities or anyone else to what users share, even if it is a serious mental health concern.

Writing in The New York Times, journalist Laura Riley said that her daughter Sophie, 29, was one of the people who sought mental health support from ChatGPT.

Before taking her own life, she told a ChatGPT therapist called ‘Harry’ that she planned on killing herself.

Therapists are generally required to report harm or risks of harm to children or adults, an exception to the otherwise confidential nature of counselling, but AI chatbots do not report people who pose a risk to themselves or others.

Although the chatbot advised Sophie to seek professional help, it did not report what she said to professional services.

Laura said: “Harry didn’t kill Sophie, but AI catered to Sophie’s impulse to hide the worst, to pretend she was doing better than she was, to shield everyone from her full agony.”

Not suitable

Maxine Walsh, a psychotherapist with the Irish Association for Counselling and Psychotherapy (IACP), told The Journal that ChatGPT should “absolutely not” be used for mental health advice, and that the death of Sophie Riley might have been prevented with the correct help.

“It is not suitable for mental health issues. It really isn’t,” she said.

AI has no intuition and cannot read body language – skills Walsh said are vital to a psychotherapist’s work.

She said ChatGPT can also give incorrect information or make mistakes, a disclaimer the chatbot includes below the message dialogue on its website.

She also said users can become attached to a chatbot after sharing vulnerable information with it, which may impact their ability to connect with a human therapist.

“ChatGPT is great, but it’s never for mental health crises. It’s not a good idea,” she added.

Accessible

Dr Putrino said that research suggests more and more people are being drawn to chatbots for therapy because they are inexpensive, convenient and comfortable to use.

Walsh said that to stop people using AI as therapy, access to psychotherapy must be improved by increasing the number of psychotherapists in the public healthcare system and expanding health insurance cover for mental health.

She said that face-to-face counselling is the best option for people seeking mental health support, but also highlighted online counselling as a cheaper, more accessible option.

In response, the Department of Health told The Journal that the public mental health service is designed with “tiered levels of support” based on need, and said it is important to distinguish between understandable human responses to difficult experiences and mental illness.

It said not everyone will need clinical or therapeutic intervention for their mental health; some can benefit from supports such as online interventions.

It added: “The public are advised that there are risks in using apps or tools which have not been clinically validated or based on evidence. Digital tools can be sufficient to meet some people’s needs, but they are supplementary and not a replacement for professional therapeutic intervention.”

Not a substitute

A spokesperson for OpenAI, the creator of ChatGPT, told The Journal that ChatGPT is not designed as a substitute for professional care, but AI can meaningfully help people by providing accessible support during sensitive interactions.

They said that the company trains its models to be helpful and respond thoughtfully, to encourage human connection, and to identify situations where users need additional support.

They said: “We care deeply about the safety and wellbeing of people who use our technology. If someone expresses thoughts of suicide or self-harm, ChatGPT is trained to encourage them to reach out to mental health professionals or trusted loved ones, and proactively shares links to crisis hotlines and support resources.”

“We consult with mental health experts to ensure we’re prioritising the right solutions and research and are developing automated tools to more effectively detect when someone may be experiencing mental or emotional distress so that ChatGPT can respond appropriately.”

The representative said that in early August, OpenAI began reminding users to take a break from AI during long sessions.

They said the company is also working on reducing the tendency for the chatbot to give flattering answers, and researching how AI impacts users emotionally.

If you have been affected by any of the issues mentioned in this article, you can reach out for support through the following helplines:

  •  DRCC – 1800 77 8888 (free, 24-hour helpline)
  •  Samaritans – 116 123 or email jo@samaritans.org (suicide, crisis support)
  •  Pieta – 1800 247 247 or text HELP to 51444 (suicide, self-harm)
  •  Teenline – 1800 833 634 (for ages 13 to 19)
  •  Childline – 1800 66 66 66 (for under 18s)
