
Facebook CEO Mark Zuckerberg. Image: Eric Risberg/AP/Press Association Images

This is how Facebook plans to crack down on fake news

The company said it will be focusing on the “worst of the worst” offenders and partnering with outside fact-checkers to sort honest news reports from made-up stories.

FACEBOOK IS TAKING new measures to curb the spread of fake news on its huge and influential social network.

The company said it will be focusing on the “worst of the worst” offenders and partnering with outside fact-checkers to sort honest news reports from made-up stories that play to people’s passions and preconceived notions.

Fake news stories touch on a broad range of subjects, from unproven cancer cures to celebrity hoaxes and backyard Bigfoot sightings.

Fake political stories have drawn attention because of the possibility that they influenced public perceptions and could have swayed the US presidential election. There have been other dangerous real-world consequences. A fake story about a child sex ring at a Washington DC pizza shop prompted a man to fire an assault rifle inside the restaurant, Comet Ping Pong.

“We do believe that we have an obligation to combat the spread of fake news,” John Hegeman, vice president of product management on news feed, said in an interview.

He added that Facebook also takes its role to provide people an open platform seriously, and that it is not the company’s place to decide what is true or false.

The plan

To start, Facebook is making it easier for users to report fake news when they see it, which they can now do in two steps, not three. If enough people report a story as fake, Facebook will pass it to third-party fact-checking organisations that are part of the nonprofit Poynter Institute’s International Fact-Checking Network.

The five fact-checking organisations Facebook is currently working with are ABC News, The Associated Press, FactCheck.org, PolitiFact and Snopes. Facebook says this group is likely to expand.

Stories that flunk the fact check won’t be removed from Facebook. But they’ll be publicly flagged as “disputed”, which will force them to appear lower down in people’s news feed. Users can click on a link to learn why that is. And if people decide they want to share the story with friends anyway, they can — but they’ll get another warning.

By partnering with respected outside organisations and flagging, rather than removing, fake stories, Facebook is sidestepping some of the biggest concerns experts had raised about it exercising its considerable power in this area. For instance, some worried that Facebook might act as a censor — and not a skillful one, either, being an engineer-led company with little experience making complex media ethics decisions.

“They definitely don’t have the expertise,” Robyn Caplan, a researcher at Data & Society, a nonprofit research institute funded in part by Microsoft and the National Science Foundation, said. In an interview before Facebook’s announcement, she urged the company to “engage media professionals and organisations that are working on these issues”.

The problem

Facebook CEO Mark Zuckerberg has said that fake news constitutes less than 1% of what’s on Facebook, but critics say that’s wildly misleading. For a site with nearly two billion users tapping out posts by the millisecond, even 1% is a huge number, especially since the total includes everything that’s posted on Facebook — photos, videos and daily updates in addition to news articles.

In a study released today, the Pew Research Center found that nearly a quarter of Americans say they have shared a made-up news story, either knowingly or unknowingly. Some 45% said that the government, politicians and elected officials bear responsibility for preventing made-up stories from gaining attention, while 42% put this responsibility on social networking sites and search engines, and a similar percentage on the public itself.

Fake news stories can be quicker to go viral than news stories from traditional sources. That’s because they were created for sharing — they are clickable, often inflammatory and pander to emotional responses. Mike Caulfield, director of blended and networked learning at Washington State University Vancouver, tracked whether real or fake news is more likely to be shared on Facebook.

He compared a made-up story from a fake outlet with articles in local newspapers. The fake story, headlined “FBI Agent Suspected In Hillary Leaks Found Dead In Apparent Murder-Suicide” and published by the nonexistent Denver Guardian, was shared 1,000 times more than material from the real newspapers.

“To put this in perspective, if you combined the top stories from the Boston Globe, Washington Post, Chicago Tribune, and LA Times, they still had only 5% the viewership of an article from a fake news site,” he wrote in a blog post.

Facebook is emphasising that it’s only going after the most egregious fake news creators and sites, “the clear hoaxes spread by spammers for their own gain,” Adam Mosseri, vice president of product for Facebook’s news feed, wrote in a blog post today.


Author: Associated Foreign Press