
Faking it

What exactly are Google and Facebook doing about the spread of fake news?

The tech giants have faced consistent accusations of affecting the US election since Donald Trump’s victory last week.


EVER SINCE Donald Trump’s victory in last Tuesday’s American presidential election, the topic of fake news has never been far from the headlines.

The suggestion that the proliferation of purely fabricated news was to blame for the billionaire’s victory has built up a head of steam, with the oft-quoted statistic that 42% of Americans get their news from Facebook.

The problem with this is that Facebook’s algorithms (automated formulas which track your browsing history) create a sort of echo chamber – the site tailors your newsfeed towards the sites and topics you’ve shown an interest in previously – so you rarely see anything that contradicts your worldview. This is especially a problem if the sort of things that interest you tend to be fabricated.
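To illustrate the principle only – Facebook’s real ranking system is vastly more complex and not public – here is a minimal, hypothetical Python sketch of a feed that scores stories purely by a user’s past engagement with each topic. Every name and number in it is invented for illustration.

```python
# Hypothetical sketch of interest-based feed ranking (not Facebook's actual system).
# It shows how scoring stories by past engagement narrows what a user sees.
from collections import Counter


def rank_feed(stories, interest_counts, top_n=3):
    """Return the top_n stories, ordered by how often the user engaged with each topic."""
    def score(story):
        # Topics the user has clicked on before score higher;
        # unfamiliar topics score zero and sink to the bottom.
        return interest_counts.get(story["topic"], 0)
    return sorted(stories, key=score, reverse=True)[:top_n]


# Invented engagement history: topic -> number of past clicks.
interests = Counter({"partisan politics": 12, "celebrity": 7, "science": 1})

candidates = [
    {"headline": "Candidate wins popular vote by a clear margin", "topic": "partisan politics"},
    {"headline": "Actor announces surprise album", "topic": "celebrity"},
    {"headline": "New study on ocean temperatures", "topic": "science"},
    {"headline": "Fact-check: vote counting is still under way", "topic": "fact-check"},
]

for story in rank_feed(candidates, interests):
    print(story["headline"])
# The fact-check piece never surfaces, because the user has no history with
# that topic – the echo-chamber effect described above.
```

If the stories a user engages with most are themselves fabricated, a ranking loop like this simply serves up more of the same.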

This is similarly a big problem for Google, which has faced intense criticism for including faked news in its search returns – a specific example came at the weekend, when a series of sites reported that Donald Trump had won the popular vote in the election by a clear margin (counting hasn’t even finished yet, and Hillary Clinton appears the clear favourite to win it).

There is a lot of money in fake news – which is why popular topics are targeted with stories whose entire aim is to go viral.

Facebook CEO Mark Zuckerberg, for his part, insists that fake news on the social network had no effect on the election’s result.

But given the capacity of such stories to influence public opinion, what steps are Facebook and Google taking to combat the phenomenon?

Last Week Tonight / YouTube

70News

The fake news story that emerged last week concerning Trump’s purported popular vote victory was first seen on a site called 70News. Within two days it was the top result returned by Google for searches about the popular vote.

A link to the site appeared at or near the top of Google’s influential rankings of relevant news stories for searches on the final election results.

Google acknowledged the problem, although as of this morning the link to 70News remains prominent in its results.

Although Google rarely removes content from its search results, the company is taking steps to punish sites that manufacture falsehoods.

In a move disclosed yesterday, Google says it will prevent its lucrative digital ads from appearing on sites that “misrepresent, misstate, or conceal information”. The action could give sites a bigger incentive to get things right or risk losing a valuable source of revenue.

The 70News article

Fake news on steroids

False information is nothing new on the internet, where debunkers have been batting down unfounded claims and urban legends for more than two decades.

But the problem has gained more attention in the post-mortem of the bitterly contested presidential election in which Trump emerged victorious over Democrat Hillary Clinton.

Trump wound up prevailing in enough key states to win the Electoral College’s decisive vote, but Clinton’s lead in the popular vote has become one of the flashpoints in the protests against Trump’s election being staged in cities across the country.

Fake news stories uncritically circulated during and after the election on Facebook have sparked a debate over the role of social media companies, which are key sources of news for large numbers of people. Critics suggest that these companies should be more careful to ensure they aren’t passing along misleading information.

Social media and the news

Google’s dominant search engine is the leading source of traffic to media sites, according to the online analytics firm Chartbeat. Meanwhile, a study by the Pew Research Center found about 60% of Americans get at least some of their news from social media sites such as Facebook, which now has 178 million users in the US and Canada.

Last summer Facebook fired a handful of journalists who oversaw its “trending” news list and replaced them with an algorithm; fake news stories quickly began to trend.


CEO Mark Zuckerberg brushed off that criticism as “crazy” in an appearance last week. He elaborated in a weekend post on Facebook in which he asserted that “more than 99% of what people see” on Facebook is authentic. Zuckerberg conceded more needs to be done to block bogus information, but said that determining what’s blatantly wrong isn’t always an easy task.

“Identifying the ‘truth’ is complicated,” Zuckerberg wrote.

While some hoaxes can be completely debunked, a greater amount of content, including from mainstream sources, often gets the basic idea right while getting some details wrong or leaving them out.

Google’s search results are also powered by algorithms that the company regularly revises to thwart sites that attempt to artificially boost their prominence.

More, better algorithms?

Google conceded its search engine misfired with the 70News story that falsely declared Trump the popular vote winner in both its headline and the body of the text.

“In this case we clearly didn’t get it right, but we are continually working to improve our algorithms,” the company said in a statement.

Bad information in an online headline or at the top of a story can be particularly damaging. Roughly 53% of the people who land on a web page stay for 15 seconds or less, according to Chartbeat’s findings.

Incorrect information is bound to ripple across the internet as more people rely on their phones, computers and other digital devices to read news that is picked out for them by automated programs, says media analyst Ken Doctor of Newsonomics.

“What we are seeing is the failure of the algorithm,” Doctor says.

These algorithms bring a lot of things into our lives that humans cannot do. But when algorithms fail, it highlights the fact that they are not just some kind of neutral technology. They are programmed by human beings and they have all the failings of human beings.

It’s difficult to know what went wrong when an algorithm goes awry, because Google, Facebook and other internet companies closely guard how they work, much the way Coca-Cola protects its recipe.

But the growing power that Google and Facebook hold over the flow of information could increase the political pressure for them to be held more accountable, Doctor says.

In the meantime, most people remain sceptical about what they read online. Only 4% of Americans have a lot of confidence in what they read on social media sites, according to Pew. Local news organisations fared better in Pew Research’s survey earlier this year, with 22% of Americans saying they trusted the information there.

With AP

