Image: Shutterstock/Mongkol_Chuewong

Abuse on Twitter: Women of colour 34% more likely than white women to be targeted

The study found that an abusive or problematic tweet was sent to the women in the study every 30 seconds on average.
Dec 18th 2018, 6:15 AM

ONE IN TEN tweets mentioning black women politicians and journalists was abusive or problematic, compared to one in 15 for white women, according to a large-scale study of abuse on Twitter.

The study also showed that black women were disproportionately targeted, being 84% more likely than white women to be mentioned in abusive or problematic tweets.

Amnesty International has today released a study into abuse against women on Twitter, conducted with Element AI, a global artificial intelligence software company.

More than 6,500 volunteers from 150 countries signed up to take part in Troll Patrol, a crowdsourcing project designed to process large-scale data about online abuse. Volunteers sorted through 228,000 tweets sent to 778 women politicians and journalists in the UK and USA in 2017.

The study stemmed from calls by Amnesty International for Twitter to publish data on the scale and nature of abuse from users, which the company has not yet done.

Around 1.1 million abusive or problematic tweets were sent to the women in the study across the year – or one every 30 seconds on average.

Milena Marin, senior advisor for tactical research at Amnesty International said that the study meant “…we have the data to back up what women have long been telling us – that Twitter is a place where racism, misogyny and homophobia are allowed to flourish basically unchecked”.

“We found that, although abuse is targeted at women across the political spectrum, women of colour were much more likely to be impacted, and black women are disproportionately targeted.

“Twitter’s failure to crack down on this problem means it is contributing to the silencing of already marginalised voices.”

Results

Element AI used the data gathered in Troll Patrol to develop a machine-learning model that attempts to automatically detect abusive tweets.

Politicians included in the sample came from across the US and UK political spectrums. The journalists included were from a diverse range of US and UK publications including The Daily Mail, The New York Times, The Guardian, The Sun, GalDem, Pink News and Breitbart.

The study found that 7.1% of tweets sent to the 778 women over the course of the year were problematic or abusive – around one every 30 seconds.

Women of colour (black, Asian, Latinx and mixed-race women) were 34% more likely to be mentioned in abusive or problematic tweets than white women.

Female politicians and journalists faced similar levels of online abuse, and researchers observed liberals and conservatives alike, as well as left- and right-leaning media organisations, being affected.

You can find out more about the study here.

Gráinne Ní Aodha
