Content featuring influencer Andrew Tate, who has been charged with rape, human trafficking, and forming an organised crime group, was found to be highly recommended. Alamy Stock Photo

TikTok and YouTube Shorts feeding male users misogynistic content, Irish research shows

The researchers tracked the recommended content being fed to ten ‘sockpuppet’ accounts that identified as male on TikTok and YouTube Shorts.

AN EXPERIMENT CONDUCTED by Dublin City University has found that the algorithms of some social media platforms are feeding male-identified accounts misogynistic and anti-feminist content. 

One of the researchers behind the project, Professor Debbie Ging, said it shows that shutting down the accounts of ‘manfluencers’ like Andrew Tate doesn’t necessarily mean their content is being removed from platforms, so social media firms need to tackle harmful content in “more sophisticated ways”.

Researchers from the university’s Anti-Bullying Centre tracked the content the apps recommended to ten ‘sockpuppet’ accounts on 10 blank smartphones, five on TikTok and five on YouTube Shorts.

The researchers found that all of the male-identified accounts were fed male supremacist, anti-feminist or other extremist content, whether they sought out that kind of content or not. 

Each account received this content within the first 23 minutes of the experiment. 

The researchers found that once the account showed interest by watching this kind of content, the amount it was recommended “rapidly increased”. 

By the last round of the experiment, the study states, after the accounts (which had been set up as belonging to a man) had watched 400 videos over two to three hours, the vast majority of the content being recommended was “toxic”, primarily falling into the alpha male and anti-feminist category.

In the last stage of the experiment, this was the case for 76% of the videos recommended by TikTok, and 78% of the videos recommended on YouTube Shorts.

“Much of this content rails against equality and promotes the submission of women. There was also a large amount of content devoted to male motivation, money-making and mental health,” said the researchers behind the experiment, Professor Debbie Ging, Dr Catherine Baker and Dr Maja Andreasen.

They said that the mental health-related content is particularly “dangerous”, as it often depicts depression as a “sign of weakness” and claims that therapy is “ineffective”.

“The other toxic categories were reactionary right and conspiracy, which accounted for 13.6% of recommended content on TikTok and 5.2% of recommended content on YouTube Shorts. Much of this was anti-transgender content,” the researchers stated. 

Overall, the experiment found that the YouTube Shorts accounts were recommended a larger proportion of toxic content (61.5% of the total recommended content) than the TikTok accounts (34.7%).

It further found that content featuring ‘manfluencers’ accounted for the majority of the recommended videos in the toxic category, the most popular of these being Andrew Tate, who featured 582 times in the YouTube Shorts content and 93 times in the TikTok content.

Professor Ging said that the findings of this report are concerning for parents, teachers, policy makers, and society as a whole. 

She and the other researchers recommended better content moderation on social media platforms, turning off recommender algorithms by default, and cooperating with trusted flaggers to highlight illegal and harmful content.

They also recommended that teachers educate young people about influencer culture and how algorithms work.

“Ultimately, girls and women are the most severely impacted by these beliefs, but they are also damaging to the boys and men who consume them, especially in relation to mental wellbeing. The social media companies must come under increased pressure from the government to prioritise the safety and wellbeing of young people over profit,” Professor Ging added.
