
Cork graduate Tom Barton shares his story of being radicalised on social media before realising how companies use algorithmic tools to push users to engage with harmful content.

Young man: I was radicalised by social media. We need to stop it happening to others

EU Commissioner Michael McGrath has the power to prevent algorithms radicalising young men, writes Tom Barton.

AS A 24-YEAR-OLD, I find social media an unavoidable presence in my life.

Using it certainly comes with some benefits. It helps me to stay in touch with far-away friends and family, stay informed and discover new things. I mean, how else would I have come across “Catman”, the Japanese construction worker walking across Australia in a cheetah costume?

But with the good comes the bad. I have first-hand experience of what powerful social media algorithms can do to vulnerable young minds.

I was around sixteen when I first logged on.

Like many teenagers, I was insecure, impressionable and craved validation. I didn’t know it at the time, but these vulnerabilities made me a prime target for Silicon Valley’s billion-dollar algorithms.

Over the course of three or four years, videos from the likes of Ben Shapiro, Jordan Peterson and Andrew Tate systematically popped up in my recommended feeds.

Ben Shapiro is a willing poster boy for the American anti-trans movement and has frequently expressed the view that racial inequalities in the United States are the result of “Black culture” rather than historical or structural racism.

Jordan Peterson – who recently appeared alongside Shapiro on a podcast with Israel’s Benjamin Netanyahu – is a Canadian clinical-psychologist-turned-conservative-icon. He is famous for refusing to call trans people by their preferred pronouns and has argued – using an analogy that compares humans to lobsters – that social inequality is not just inevitable, but desirable.

Then there’s Andrew Tate, famous for bragging about hitting women and for showing off his expensive cars and clothes. He thinks depression is “not real” or “gay” and is currently facing charges of human trafficking, money laundering and rape.

Harmful and hateful ideas like these were deliberately targeted at me.

They gave me simple answers to complicated questions and played on my fears, insecurities and ego.

The more I saw, the more I believed. And the more I believed, the more time I spent scrolling.

How the algorithm served up hateful content

I found myself getting sucked into Ben Shapiro’s warped ideas that Black people’s oppression was their own fault. I found myself agreeing with Jordan Peterson that inequality was a good thing. If rich white men were at the top of society, I thought, it’s because we were the best – it had nothing to do with systems and structures of oppression. Finally, I found myself resonating with Andrew Tate’s narrative that women were put on earth to tempt otherwise virtuous men – after all, was it not Eve who ate the forbidden fruit in the Garden of Eden?

This is difficult to share. Looking back, I feel ashamed and embarrassed by things I said and believed. But – without excusing myself too much – I recognise that I was immature, ignorant and easily captivated by simple narratives rooted in emotions like fear, insecurity and anger.

Ben Shapiro's social media account was among those pushed onto Tom's feed.

From the moment I started using them, the platforms were collecting data on me. They tracked everything that I did. Over time, they were able to build a frighteningly accurate profile of me.

I never searched for Ben Shapiro, Jordan Peterson, or Andrew Tate. I didn’t follow them, and I certainly didn’t ask to see their content.

Regardless, the algorithm knew that when people like me saw content like that, they stuck around longer. It didn’t know or care that it was nudging me toward racism, misogyny, and transphobia. All it cared about was that I kept watching.

At the time, I didn’t notice how much my thoughts and beliefs were being distorted.

Only now, with the benefit of hindsight, do I understand that social media is not designed to “inform”, “connect” or bring out the best of us – it’s built to harvest attention from users and then sell that attention to advertisers.

Trusting social media companies to “do the right thing” and put people’s wellbeing over their own financial interests is like trusting the fox to take care of the henhouse – it’s naïve and dangerous.

Defending against digital threats

Fortunately, there’s hope.

Fellow Cork man Michael McGrath is the EU Commissioner for Democracy, Justice, the Rule of Law and Consumer Protection. It’s his responsibility to oversee the Democracy Shield, Europe’s defence against digital threats.

The most meaningful step he can take is to require that recommender algorithms be switched off by default.

Crucially, this doesn’t mean banning recommendation systems altogether. People who want to keep using them can do just that.

The point is to put control back in users’ hands: it would be up to each of us whether or not a recommender system decides what we see.

The problem with social media isn’t simply the harmful content itself; it’s the fact that harmful content gets amplified by algorithms that are designed to prioritise engagement over all else.

Switching off recommender algorithms by default tackles the problem at its source.

It would give users more control over what appears in their social media feeds.

Of course, McGrath is under pressure from those who benefit from social media’s harmful design. Trump has explicitly warned Europe against regulating American companies. This is hardly surprising, given the outsized role that social media corporations played in his re-election.

Trump and his allies want to push the idea that giving users the ability to opt out of algorithmic manipulation is somehow an attack on freedom. But what is “free” about a system that hijacks the darkest parts of human psychology to keep us scrolling longer?

Michael McGrath has a clear choice to make. He can use the Democracy Shield to do what it’s supposed to – protect European democracy. Or, he can safeguard the profits of Mark Zuckerberg, Elon Musk and co.

Michael McGrath has the power to make our online spaces safer and healthier, or to let another generation fall victim to the same algorithmic manipulation that I did.

Tom Barton is a recent graduate from Cork. He is a member of the campaigning community Uplift and an active part of the campaign to make social media safer for everyone.
