Frances Haugen (left) and Labour Senator Annie Hoey.

Facebook whistleblower tells TDs that harm from metaverse 'could be worse'

Facebook employee-turned-whistleblower Frances Haugen has made a number of blistering claims about the social network.

FACEBOOK WHISTLEBLOWER FRANCES Haugen has told an Oireachtas committee that the metaverse could do even more harm than traditional social media unless proper regulation is put in place. 

The metaverse is an internet centred on virtual reality: 3D environments that Facebook has heavily backed as its future. Facebook has changed its corporate name to Meta to reflect this shift. 

Facebook employee-turned-whistleblower Haugen last year made a number of blistering claims about the social network, among them that the company did not act on internal research about the harm caused by its products. 

For example, she claimed that the firm’s own research found that Facebook-owned Instagram is more dangerous than other social media such as TikTok and Snapchat, because the platform is focused on “social comparison” about bodies and lifestyles. 

Speaking to TDs and Senators at the Oireachtas Committee on Tourism, Culture, Arts, Sport and Media today, Haugen said the metaverse could present a new challenge for regulators because they will not be able to see the harm it causes. 

She said more "immersive" technologies such as the metaverse would make harm even more difficult to identify.

“The metaverse is new and if we were sitting down and writing a protection plan today, we actually know very little about it,” she said.

I’m just going to lay out what could happen in the future if we don’t implement some recurrent system of mandatory risk assessments. We’re gonna see the exact same thing happen in the metaverse, or maybe even worse, than we’ve seen already on places like Instagram.  

“People had been complaining about many of the issues in my disclosures for years, sometimes a decade. The only thing that changed was I brought forward proof that Facebook knew about these harms, because they were real. What we’re gonna see with the metaverse is people are going to start having individual problems,” she said.

People like pediatricians are going to start seeing things like kids getting more addicted to these immersive systems than they already are, like Instagram. But we’re not going to be able to prove it, because we’re all going to only see our individual experiences of things like the metaverse, we’re not going to see the systematic impact. 

A number of politicians also raised concerns about the metaverse, with Fianna Fáil Senator Malcolm Byrne asking how politicians can legislate for it and Sinn Féin’s Fintan Warfield saying that politicians were “still talking about 2D flat screens”.

Haugen agreed and said that Facebook was trying to “shift the conversation” to the metaverse.


In her opening statement to the committee, Haugen said that Ireland’s planned Online Safety Bill should include mandatory transparency, as self-regulation by social media giants does not work.

“There is a major, major, major national security problem with Facebook with regard to lack of transparency,” Haugen said.

She added that the government should order an independent review of the Data Protection Commission.

“As you create an independent, robust and effective online safety regulator, you must launch an independent review into the DPC so that it too can start to enforce the law thoroughly and boldly,” she said, adding that Ireland holds a “unique responsibility” because it is the place of establishment for these companies.

Again referencing the Online Safety Bill, Haugen said it must “focus less on content and more on how content is shared and spread”.

“You cannot simply rely on the deletion or criminalisation of harmful content. Not only because that risks infringing on free speech, but because it doesn’t work,” she said.

Regulatory regimes which have focused solely on deleting content have failed. There’s just too much out there. 

Under questioning from Fianna Fáil Senator Shane Cassells, Haugen acknowledged that Ireland’s regulator simply cannot hire the requisite expertise in online algorithms that Facebook has access to. 

Algorithms are automated technologies that are used by social media companies to tailor content for individual users. 

Haugen said that the Online Safety Commission would need at least 20 algorithm experts to tackle the tech giant, but that there are perhaps “200 or 300” such experts in the world.

“The idea that Ireland is going to be able to go out on the world stage and pay market-competitive salaries for specialists like this, it’s an undue burden on the Irish people,” she said. 

Asked what one such expert could be paid, Haugen said their total compensation including salary and equity could be “in the order of half a million dollars to three-quarters of a million dollars.”
