
Column If you read something offensive, don’t retweet it, report it

The future of the internet depends on social networks like Facebook and Twitter developing better protocols for dealing with offensive and potentially criminal postings, writes Dr Ciarán Mc Mahon.

IN MY LAST column, I was liberal in my reference to the types of characters who make the internet an unpleasant place. Call them trolls, call them assholes, call them whatever you may: but what exactly should be done about them?

Recently in the UK, there have been court cases centring on people posting nasty things on social networks. The question, then, is this: if our cousins across the water are attempting to stamp out this sort of behaviour through the courts, should we follow suit?

But let’s get real here. Right now, as New York deals with the aftermath of a superstorm, I have just seen a tweet saying ‘how funny it would be if the new World Trade Centre got blew over’ [sic]. Not exactly ‘grossly offensive’ nor illegal, but 9/11 victims might think differently.

That’s one problem with these cases – whether something is offensive depends on the material being seen by someone capable of being offended by it. So the genius who just tweeted that witticism will probably get away with it, given that New Yorkers are kind of busy at the moment and no one has retweeted it.

The actual content of that example is beside the point, though; what matters is that we know what was tweeted. Does anyone know what Liam Stacey or Matthew Woods actually said? Notably, Victoria Coren of the Guardian went through the former’s case and found it very difficult to establish the exact content that was deemed criminal.

Similarly, I’ve found it hard in the case of the latter: which is disturbing, for a number of reasons. If we don’t want repeats of this behaviour, then surely it should be clear what someone is being criminalised over. Perhaps the networks in question are afraid of drawing attention to it, but a case study or two wouldn’t go amiss.

‘A tree falling in the woods’

More to the point, these cases come to the attention of the law because they come to the attention of a lot of other people, through retweets and shares.

Not only does this highlight how strange it is that we don’t know what offensive material actually made criminals of the two young gentlemen above, but it also highlights the ridiculousness of fiascos like these – this is a ‘tree-falling-in-the-woods-making-no-noise’ scenario. Perhaps I should get all my offensive tweets out late at night, when no-one notices them?

In fact, it seems that many of these cases have come to the attention of the authorities simply because a ‘celebrity’ with a high number of followers happened to notice and highlight it. But there is an important point here – why does anyone retweet such material? Sharing only increases the likelihood that someone who could genuinely be hurt by it would see it.

There is something quite disingenuous about this type of behaviour, especially when it is framed as some kind of moral outrage: taking offence on behalf of some other, imagined person and publicising the material so that its author can be ‘brought to justice’.

Perhaps, while we understand society’s rules and laws, we secretly want an excuse to break them; in such instances, we take the opportunity to say something offensive under the cover of someone else having said it first.

It’s entirely hypocritical, as it brings more publicity to the act itself – which is why the UK’s Director of Public Prosecutions has recently reminded the public that retweeting an offensive tweet is just as criminal as creating one in the first place. Of course, no-one has yet been prosecuted as such, but the point is simple: if you read something offensive, don’t retweet it, report it.

More action from Facebook and Twitter

Again – don’t hate the troll, hate the bridge – our focus should also be on the technology that allows such behaviour to fester, rather than the nasty behaviour itself, which is hardly unusual.

Service providers like Facebook and Twitter need to develop far more sophisticated and transparent protocols for dealing with this sort of content as, legalese notwithstanding, they are morally responsible for it.

When you look through the Help sections of both these sites with regard to offensive content, neither goes into much more detail than ‘we take it seriously’, which is not good enough – there needs to be a clear process chart, ethical principles, escalation procedures and so on.

It has been said that these cases have an element of ‘mob justice’ to them, with hordes of people clamouring for retribution, and there is certainly some truth to that characterisation.

At the same time, it should be remembered that this is still a very new medium of communication and that ‘mob justice’ is often the first stage in the history of justice systems.

This again puts a large onus on the social media giants to develop the type of clear and ethical systems I’ve described. A considerable undertaking, admittedly, but one on which the future of the internet depends.

While there is clearly a benefit for them in reducing the amount of negative press they receive in these cases, and there is also a ‘regulate yourselves or be regulated by government’ imperative, I’d like to think the old-fashioned notion of moral obligation is also important.

I haven’t mentioned, nor have I forgotten, another such notion often raised in these cases: that of freedom of speech. That’ll have to wait until next week.

Dr Ciarán Mc Mahon is a psychologist and researcher in politics and social media. He blogs and can be followed at @cjamcmahon.
