
Former Facebook employee Frances Haugen at the US Senate yesterday. Alex Brandon/PA Images

Five things we've learned from the Facebook whistleblower

Frances Haugen has called for the social media giant to be regulated.

YESTERDAY, FORMER FACEBOOK data scientist Frances Haugen called on the US Congress to regulate the social media giant, telling politicians that the company’s executives refuse to make changes themselves because they value profits over safety.

Haugen’s appearance was the culmination of weeks of criticism of Facebook, following numerous reports in The Wall Street Journal based on leaked internal documents she had obtained secretly before leaving her job in the company’s civic integrity unit.

She has filed complaints alleging that Facebook’s own research shows that it amplifies hate, misinformation and political unrest, and claiming the company hides what it knows.

Facebook maintains Haugen’s allegations are misleading and insists there is no evidence to support the premise that it is the primary cause of social polarisation.

The hearing could prove a landmark moment for the company and for the regulation of social media giants as a whole. Here are five things we’ve learned from Haugen’s allegations and her appearance before the US Congress yesterday.

Big Tech’s ‘Big Tobacco’ moment.

Underlying Haugen’s Senate appearance and her decision to reveal the inner workings of Facebook is her belief that the platform – which she yesterday accused of putting its “astronomical profits before people” – urgently needs to be regulated.

In her testimony, she emphasised the power held by a service that is tightly woven into the daily lives of billions who use Facebook, Instagram and WhatsApp – at one point referencing how Monday night’s outage affected people across the world.  

Although she said it was her belief that Facebook is not intrinsically bad, Haugen argued that the company needs external intervention to guide it away from being a platform that breeds toxicity.

At the very start of the hearing, she told the Senate she believed Facebook’s products “harm children, stoke division and weaken our democracy”.

“Left alone, Facebook will continue to make choices that go against the common good. Our common good,” she said.

“When we realised big tobacco was hiding the harms, that caused the government to take action. When we figured out cars were safer with seatbelts, the government took action.

“And when our government learned that opioids were taking lives, the government took action,” she said, adding: “I implore you to do the same here today.”

Haugen also claimed that although Facebook could change, it was not going to do so on its own.

She called for senators to overhaul Section 230, part of the US Communications Decency Act which protects social media platforms from lawsuits over content posted by users.

“They have a hundred percent control over their algorithms, and Facebook should not get a free pass on choices it makes to prioritise growth and virality and reactiveness over public safety,” she said.

In response, some members of Congress suggested that further regulation is coming.

“Here’s my message for Mark Zuckerberg: your time of invading our privacy, promoting toxic content and preying on children and teens is over,” said Senator Ed Markey.

“Congress will be taking action… we will not allow your company to harm our children, our families and our democracy anymore,” he added.

Senator Amy Klobuchar also said she saw Haugen’s disclosures as the long-needed push to get Congress moving on regulation of Big Tech.

“I think the time has come for action, and I think you are the catalyst for that action,” she told the whistleblower.

Frances Haugen speaks at Capitol Hill. AP/PA Images

Children are a key demographic for Facebook – but its own studies suggest that its platforms can harm younger users.

Revelations about Facebook’s effect on children and the company’s own research into the problem emerged before yesterday’s hearing, but Haugen’s testimony emphasised the extent to which the issue is affecting younger users.

A report in The Wall Street Journal prior to the hearing claimed that Facebook had ignored its own research on the negative impact of Instagram in particular.

A document leaked by Haugen highlights that Facebook researchers have found that teenagers, especially young girls, can experience damage to their mental health and body image as a result of using the company’s products.

One internal Facebook study found that 13.5% of teenage girls said Instagram made thoughts of suicide worse, while 17% said it made eating disorders worse.

But Haugen alleges that Facebook is hiding this information from the public, claiming that its platforms are safe.

“The documents I have provided to Congress prove that Facebook has repeatedly misled the public about what its own research reveals about the safety of children, the efficacy of its artificial intelligence systems and its role in spreading divisive and extreme messages,” she said in her written evidence to the Senate.

“Facebook knows its engagement ranking on Instagram can lead children from very innocuous topics like healthy recipes… to anorexic content over a very short period of time,” she added later.

“Facebook knows they are leading young users to anorexia content.”

At one point, Haugen suggested raising Facebook’s age limits to 16 or 18 years old, based on data around problematic use or addiction and on children’s difficulty in self-regulating their use of the platforms.

Users currently have to be at least 13 years old to join Facebook, although Haugen suggested there are huge numbers of children below that age using its platforms.

She explained how children are a key demographic for the social network, referencing “Instagram Kids” – a product designed for 10-12 year olds which was recently put on hold by the company.

“I would be sincerely surprised if they do not continue working on Instagram Kids,” she said.

“[Facebook wants] to make sure that the next generation is just as engaged with Instagram as the current one, and the way they’ll do that, making sure children establish habits before they have good self-regulation.”

Facebook ‘does not have the capability to stop misinformation’.

One of the biggest criticisms of Facebook in recent years is that it amplifies fake news because its algorithms can create misinformation rabbit holes for users.

And while the company has attempted to crack down on misinformation on its platforms, the problem is still rife.

Haugen linked the issue to a lack of resources available to tackle certain problems and in certain languages, explaining that while only 9% of Facebook users speak English, 87% of the company’s spending on misinformation is devoted to English.

She also highlighted how recent controversies meant Facebook had struggled to attract staff in key areas, which in turn led to more controversies, which in turn made it even harder to attract staff.

“A pattern of behaviour that I saw on Facebook was that often problems were so understaffed [that] there was an implicit discouragement from having better detection systems,” she said.

Giving an example, Haugen told senators that Facebook can only manage around a fifth of misinformation relating to Covid-19 vaccines.

“I do not believe Facebook, as currently structured, has the capability to stop vaccine misinformation,” Haugen said, noting that current efforts were only likely to remove “10 to 20 percent of content”.

But she also warned that the spread of misinformation was partly a choice by Facebook to increase engagement.

In another example, Haugen said the company turned on safeguards designed to prevent misinformation in the lead-up to last year’s US election. After Joe Biden won, she said, Facebook switched them off again because engagement was down.

She alleged to senators that engagement-based ranking – which prioritises the stories users engage with most in their news feeds – means that “teenagers [are] exposed to more anorexia content” and that the platform is “fanning ethnic violence” in places like Ethiopia.

The buck stops with Zuck.

Haugen was also critical of the structure of Facebook, in particular the power that its founder and CEO Mark Zuckerberg has and his apparent unwillingness to tackle the biggest problems with the platform.

She highlighted various incidents in which she said Zuckerberg and others had been told about the company’s potentially negative impacts on children and done nothing.

She recalled how Zuckerberg was told about options to remove an algorithm known as MSI, which was shown to amplify hate speech, misinformation and violence-inciting content.

Haugen particularly referenced ethnic tensions in Myanmar, where Facebook has partly been blamed for violence against the Muslim Rohingya minority through the dissemination of misinformation and false rumours.

She said that despite information showing the impact this would have, Zuckerberg chose not to remove the algorithm.

Asked why Facebook wouldn’t do so, Haugen claimed that employee bonuses were linked to the system and its ability to create more engagement.


Forcing Facebook to divest Instagram ‘could worsen’ the company’s problems.

There have been various calls over the years to break up Facebook’s products because of its overwhelming influence on global communications.

Last year, US federal regulators moved to sue Facebook to seek forced divestment of Instagram and WhatsApp.

However, Haugen warned politicians yesterday that this could backfire in countries where Facebook’s products are the primary way that people access the internet.

Instead, she argued that Facebook needs money – specifically through advertising income on Instagram – to be able to properly police itself. 

“If you split Facebook and Instagram apart, it’s likely that most advertising dollars will go to Instagram and Facebook will continue to be this Frankenstein that is endangering lives around the world,” she said. “Only now there won’t be money to fund it.”

Contains reporting by Press Association and © AFP 2021.
