
OpenAI said the parental control features should be rolled out 'within the next month'. Alamy Stock Photo

ChatGPT to add parental controls after lawsuit from parents of teenager who died by suicide

A lawsuit alleges that ChatGPT engaged in a conversation about suicidal ideation with 16-year-old Adam.

AMERICAN ARTIFICIAL INTELLIGENCE firm OpenAI said yesterday that it would add parental controls to its chatbot ChatGPT, a week after an American couple whose teenage son died by suicide raised concerns about the platform.

“Within the next month, parents will be able to… link their account with their teen’s account” and “control how ChatGPT responds to their teen with age-appropriate model behaviour rules”, the generative AI company said in a blog post.

Parents will also receive notifications from ChatGPT “when the system detects their teen is in a moment of acute distress”, OpenAI added.

Matthew and Maria Raine argue in a lawsuit filed last week in a California state court that ChatGPT cultivated an intimate relationship with their son Adam over several months in 2024 and 2025 before he took his own life.

The lawsuit alleges that in their final conversation on 11 April 2025, ChatGPT helped 16-year-old Adam steal vodka from his parents and engaged with him about suicidal ideation.

Adam was found dead hours later.

“When a person is using ChatGPT it really feels like they’re chatting with something on the other end,” said attorney Melodi Dincer of The Tech Justice Law Project, which helped prepare the legal complaint.

“These are the same features that could lead someone like Adam, over time, to start sharing more and more about their personal lives, and ultimately, to start seeking advice and counsel from this product that basically seems to have all the answers,” Dincer said.

Product design features set the scene for users to slot a chatbot into trusted roles like friend, therapist or doctor, she said.

Dincer said the OpenAI blog post announcing parental controls and other safety measures seemed “generic” and lacking in detail.

“It’s really the bare minimum, and it definitely suggests that there were a lot of (simple) safety measures that could have been implemented,” she added.

“It’s yet to be seen whether they will do what they say they will do and how effective that will be overall.”

The Raines’ case was just the latest in a string of cases that have surfaced in recent months of people being encouraged in delusional or harmful trains of thought by AI chatbots – prompting OpenAI to say it would reduce models’ “sycophancy” towards users.

“We continue to improve how our models recognize and respond to signs of mental and emotional distress,” OpenAI said in its blog post.

The company said it had further plans to improve the safety of its chatbots over the coming three months, including redirecting “some sensitive conversations… to a reasoning model” that puts more computing power into generating a response.

“Our testing shows that reasoning models more consistently follow and apply safety guidelines,” OpenAI said.

The move comes weeks after OpenAI announced that ChatGPT will no longer tell people to break up with their partners and will instead encourage them to “think it through”.

If you have been affected by any of the issues mentioned in this article, you can reach out for support through the following helplines:

  •  DRCC – 1800 77 8888 (free, 24-hour helpline)
  •  Samaritans – 116 123 or email jo@samaritans.org (suicide, crisis support)
  •  Pieta – 1800 247 247 or text HELP to 51444 (suicide, self-harm)
  •  Teenline – 1800 833 634 (for ages 13 to 19)
  •  Childline – 1800 66 66 66 (for under 18s) 