
Probe into Facebook and Instagram launched over possible online 'rabbit holes' affecting minors

The European Commission has claimed that the alleged rabbit holes may cause “addictive behaviour” in children.

META, THE PARENT company of Instagram and Facebook, is being investigated over possible online “rabbit holes” created on its platforms which may breach the EU’s strict rules for the safety of minors online.

The European Commission today announced that it has opened formal proceedings against Meta over the alleged rabbit holes, arguing that they risk creating addictive behaviours in child users.

The Digital Services Act (DSA), adopted last year, introduced strict regulation and placed obligations on social media and other large online platforms to ensure the content on their websites did not breach the rights of EU citizens.

Breaching any of the regulations, which came into effect in February and include measures to stop the spread of misinformation and illegal content online, can expose companies to fines of up to 6% of their annual turnover.

The proceedings will investigate three potential breaches of the DSA, following an initial assessment completed earlier this year.

The initial assessment found that the design of Facebook’s and Instagram’s online interfaces “may exploit the weaknesses and inexperience of minors and cause addictive behaviour” through the creation of “rabbit holes”.

These design features, the EU argues, risk undermining the rights of children in the EU to good physical and mental wellbeing.

The Commission has also accused Meta’s age verification tools of not being “reasonable, proportionate and effective”, which may have allowed young children to view harmful content on its platforms.

The Commission is also investigating if Meta’s privacy, safety and security measures for minors are appropriate and proportionate – noting the default privacy settings for minors’ accounts as a particular concern.

The office of the Commissioner for the Internal Market, Thierry Breton, will now carry out an “in-depth investigation” into these alleged breaches. Last month, the EU opened similar formal proceedings into TikTok.

If the investigation determines that a fine is necessary, Ireland’s Coimisiún na Meán will be required to seek the payment from Meta, as the company is headquartered in Dublin.
