'We will not allow any graphic images of self-harm, such as cutting on Instagram'
The CEO of the company says it needs to do more.

INSTAGRAM HAS ANNOUNCED it is clamping down on images related to self-injury such as cutting.

The move came after British Health Secretary Matt Hancock met with social media companies about doing more to safeguard the mental health of teenagers using their platforms.

British teenager Molly Russell was found dead in her bedroom in 2017. The 14-year-old had apparently taken her own life, and her Instagram account reportedly revealed she followed accounts related to depression and suicide.

“It is encouraging to see that decisive steps are now being taken to try to protect children from disturbing content on Instagram,” said the girl’s father, Ian Russell.

“It is now time for other social media platforms to take action to recognise the responsibility they too have to their users if the internet is to become a safe place for young and vulnerable people.”

Changes to Instagram’s self-harm content rules follow a comprehensive review involving experts and academics from around the world on youth, mental health, and suicide, according to chief executive Adam Mosseri.

Downplaying self-harm

“Over the past month, we have seen that we are not where we need to be on self-harm and suicide, and that we need to do more to keep the most vulnerable people who use Instagram safe,” Mosseri said in an online post.

“We will not allow any graphic images of self-harm, such as cutting, on Instagram – even if it would previously have been allowed as an admission.”

Instagram has never allowed posts that promote or encourage suicide or self-harm.

The Facebook-owned service is removing references to non-graphic content related to people hurting themselves, such as healed scars, from search, hashtag, explore, or recommendation features.

“We are not removing this type of content from Instagram entirely, as we don’t want to stigmatise or isolate people who may be in distress and posting self-harm related content as a cry for help,” Mosseri said.

Instagram also planned to ramp up efforts to get counseling or other resources to people who post or search for self-harm related content.

“During the comprehensive review, the experts, including the Centre for Mental Health, reaffirmed that creating safe spaces for young people to talk about their experiences — including self-harm — online is essential,” Mosseri said.

However, the experts collectively advised that graphic images of self-harm — even when shared as an admission of someone’s struggles — have the potential to unintentionally promote self-harm.

Instagram’s aim is to eliminate graphic self-injury and suicide-related imagery and to significantly reduce the prominence of related content across the service’s features, while remaining a supportive community, according to Mosseri.

Yesterday, Mosseri joined representatives from Facebook, Google, Snapchat, Twitter and other companies who met with Hancock to discuss handling of content related to self-injury or suicide.

“What really matters is when children are on these sites they are safe. The progress we made today is good, but there’s a lot more work to do,” Hancock said after the meeting.

“What all the companies that I met today committed to was that they want to solve this problem, and they want to work with us about it.”

If you need to talk, support is available:

  • Samaritans 116 123 or email
  • Aware 1800 80 48 48 (depression, anxiety)
  • Pieta House 1800 247 247 or email (suicide, self-harm)
  • Teen-Line Ireland 1800 833 634 (for ages 13 to 19)
  • Childline 1800 66 66 66 (for under 18s)

A list of HSE and HSE-funded services can be found here

© – AFP 2019
