Opinion: Banning under-16s from social media is a half-measure. We should ban toxic algorithms

Australia is trying to ban teens from social media, but that only scratches the surface as a solution to the problems tech companies are causing us, writes Killian Mangan.

ON THE HEELS of Australia’s recent move to ban those under the age of 16 from social media, our government is looking into protections against online harm that could be implemented in Ireland.

We have already seen many other attempts to protect society from the harms of social media.

In Europe, we have attempted to clamp down on disinformation on social media through the Digital Services Act, which regulates content moderation on the largest platforms.

Countries such as China have built their own social media companies while banning foreign competitors, ensuring that the government maintains tight control of the message and culture.

But all of these attempts to protect people rest on restricting individual speech and behaviour, which inevitably provokes significant backlash from sections of the population who condemn the measures as political censorship and an attack on individual freedom, playing into fears of a loss of control.

The tech barons behind these platforms are counting on many elected representatives being older and, they assume, less tech-savvy, which they treat as an advantage in their lobbying efforts.

This makes it all the more important for those of us who have grown up entirely in the digital age to consistently highlight the way that these toxic systems on social media function, and specifically to identify the many harms that recommender algorithms cause.

The introduction of recommender systems on social media is likely the single most harmful change we’ve experienced globally in the past 10 years.

That may sound hyperbolic to someone who is hearing the concept for the first time, and this abstract change is a lot less politically sexy than the much more visible and tangible crises we currently face, such as the housing crisis or climate crisis. But it is interlinked with every single issue we’ve experienced in recent years, and with the very functioning of our democracy itself.

Who chooses what we see online?

Until about 2014, social media companies let us choose what we wanted to see. We would follow friends or creators whose content interested us, and everything we chose would appear in a chronological feed, with the most recent posts first.

It was a silly and somewhat whimsical place compared to today, where people often posted inane photos of their breakfast and early memes. Algorithms such as recommender systems were limited to suggesting friends to follow.

Then, in a huge scandal which has since been largely forgotten, Facebook revealed that it had manipulated the order of content on the feeds of nearly 700,000 users without their knowledge, in order to test which emotions maximised engagement.

Ultimately, they realised that they could maximise engagement (which in turn maximised advertising revenue and profits) by using recommender systems to artificially amplify content which gave users feelings of hate (and incongruously, feelings of cuteness), while burying the rest.

Since 2014, we have seen an explosion of hate across the entire world. Far-right and other hateful reactionary parties and organisations are gaining ground in almost every country worldwide, regardless of each country’s individual levels of immigration, inequality, cost of living, diversity, progressivism, or any other often-highlighted individual cause of a growth in hate.

Promoting hatred for clicks

And no wonder; despite the victim complex and anti-establishment message that these extremist far-right actors cling to, they have the most powerful propaganda machine the world has ever seen, as well as the world's richest man, Elon Musk, and its most powerful leader, Donald Trump, on their side.

While there have been countless examples of propagandistic media throughout history, never before have we had billions of people being fed hours upon hours of carefully curated hateful content each day.

This is made even more powerful by the illusion of open democratic debate and free speech that these recommender systems create. In previous eras, people could point to specific journalists and media outlets spreading propaganda and identify their biased agendas. These systems instead repeatedly surface the most extreme 1% of content from 'ordinary people' while hiding the rest. Because the very design of the system presents these extreme individuals as 'the voice of the people', audiences are far less critical of what is said and shared than they would be if it came from traditional institutions or organisations.

On an individual level, this attempt to maximise engagement has also caused addictive tendencies to skyrocket, as well as making people feel more isolated, lonely, anxious, unconfident in themselves, and depressed as they are increasingly affected by extreme levels of social comparison.

Recommender systems on social media often promote hateful content designed to attract clicks. Alamy Stock Photo

So, knowing the pervasive harm caused by social media as currently designed, what do we do now?

Banning social media for young people ignores the incredibly harmful societal effects it has on the rest of the population, while banning social media entirely seems both unreasonable and impossible, even in the face of the grave threats it poses as currently designed.

The most immediate solution is to ban companies from using recommender systems on social media entirely (outside a few specific cases, such as suggesting friends). That would restore our freedom to choose what we see online, and at least pause the years-long spiral towards increased extremism, misinformation, social media addiction and polarisation.

Ireland’s role

In Ireland, we have a unique role to play: the US tech multinationals have their EU headquarters here, and our online media regulator, Coimisiún na Meán, has been tasked with regulating these companies across Europe.

But this also means that social media companies have much to lose if our government finally decides to prioritise safety, mental health and democracy over the profits of these tech titans. Expect intense lobbying by these companies to protect their immense profits at all costs, if the idea of banning recommender systems truly enters the mainstream.

Banning recommender systems on social media will only pause the feedback loop which has in the past decade led us to the toxic culture we exist within today.

We must also take steps to begin to build back towards a more hopeful, democratic, and cohesive society.

Some key issues I’ve mentioned before – inequality and static or declining quality of life for large swathes of people – have caused mass dissatisfaction and will continue to exist. Tackling them directly will leave less discontent for the far-right and other hateful actors to exploit in the first place.

We must also promote more positive alternatives to Facebook, Instagram, X, TikTok and the rest. The pure profit motive of these companies is squarely to blame: they prioritise their bottom line over the harm caused to billions of people.

We need to promote non-profit, decentralised alternatives such as the German social media network Mastodon, along with other public, non-profit and democratic platforms that focus on building community, connecting people and supporting culture, rather than maximising profit at all costs.

All of the above may seem jarring to someone unacquainted with the specifics of how algorithms and other related technology and design influence behaviour; it may be hard to believe that a few lines of code written by a developer in an office in Silicon Valley can truly be a major cause of increased misogyny, homophobia, transphobia, racism, addictive tendencies, loneliness, anxiety, depression, eating disorders, distrust in democracy and institutions, or a decline in social cohesion and community.

But that lack of awareness of how these systems influence our behaviour is exactly why they have been so incredibly pervasive over the past decade, and why we must stop them from making things worse before our democracy and modern society cross the point of no return.

Killian Mangan is a computer game designer from Waterford who is interested in urban design, accessibility, housing and climate. He is an independent who ran in the recent local and general elections on a platform to Democratise-Decentralise-Decarbonise.
