
Hate Speech law: 'As legislators, it is our responsibility to draw the line on hate'

Malcolm Byrne says the ‘Hate Speech’ legislation is attracting attention and asks how legislators can strike the balance on this issue.

THE CRIMINAL JUSTICE (Incitement to Violence or Hatred and Hate Offences) Bill, 2022, attracted limited comment when it passed through the Dáil last year and was carried at the final stage by 110 votes to 14.

Yet, it has been highlighted in recent days following the heavy defeats of the two referendums on family and care. Some of those who opposed the referendums cite the Hate Speech legislation as the latest element of the ‘woke agenda’ that they wish to oppose. Others, including those in government parties, feel that the legislation should not be a priority as we move into the final year of this administration.

The closer we move toward elections, the more nervous politicians will be about engaging in areas that may appear controversial. But just because we face political contests does not mean we should shy away from debate – the underlying issues that pointed to the need for this legislation will not go away.

Why we need this legislation

The desire to legislate in this area does not come from nowhere. It comes from very genuine concerns where we have seen individuals targeted because of their race, ethnicity, sexual orientation, religion and other grounds related to their perceived or actual identity.

We already have the Prohibition of Incitement to Hatred Act 1989, but that legislation has been found not to be sufficiently effective. It is also from a different era, long before the emergence of an increasingly polarised online public space that facilitates anger directed against individuals and groups because of who they happen to be.

There are legitimate concerns that freedom of expression could be hugely damaged if the State seeks to intervene too far. So what principles should we use as legislators if we choose to regulate speech and expression? What might guide us?

What about these:

“You may not threaten, incite, glorify, or express desire for violence or harm.”

“You can’t affiliate with or promote the activities of violent or hateful entities.”

“You may not share abusive content, engage in the targeted harassment of someone, or incite other people to do so.”

“You may not attack other people on the basis of race, ethnicity, national origin, caste, sexual orientation, gender, gender identity, religious affiliation, age, disability or serious disease.”

These are not Sections of the government’s Hate Speech legislation. They are, in fact, the “Rules of X”, the community standards of the platform formerly known as Twitter. Drawn up by people who work for one of the world’s wealthiest men, they set the limits on free speech on his platform with no recourse to legislators and determine how these rules are enforced.

Although Voltaire never actually said or wrote it, a phrase attributed to him is often trotted out when debates arise around free speech:

“I disapprove of what you say, but I will defend to the death your right to say it.”

Whether in Revolutionary France or in contemporary democracies, it remains a point of contention where the line should be drawn, if one is to be drawn at all, between ensuring that vital right of freedom of expression (no matter how horrible or horrific the views expressed) and protecting other rights, such as privacy, freedom from harassment, protection from incitement to violence and the right not to be targeted because of one's identity.

The right to say it

The debate around ‘freedom of expression’ is forming a central element of the so-called culture wars. When and where is it appropriate to regulate what people can say? What constitutes ‘Hate Speech’? How do we allow for awful but lawful content online, but still try to ensure safe spaces and battle polarisation?

As a member of the Oireachtas Media Committee, these were among the questions I and colleagues considered as we examined in detail what became the Online Safety and Media Regulation Act. The issues are now resurfacing in the context of the deliberations on Hate Speech legislation.

Arguably some of the best legal thinking on these questions came from US Supreme Court Justice Oliver Wendell Holmes. Holmes was a civil libertarian and a strong defender of the American First Amendment who tried to draw the line between what should be classified as ‘protected’ and ‘unprotected’ speech.

In a case in 1919, he set out the ‘clear and present danger’ test that needed to be considered by lawmakers.

The question in every case is whether the words used are used in such circumstances and are of such a nature as to create a clear and present danger that they will bring about the substantive evils that Congress has a right to prevent. It is a question of proximity and degree.

He used the example of someone running into a crowded theatre and falsely shouting ‘Fire!’. There are clear and dangerous consequences of such an action.

The objective of any democratic society should be to protect the principle of ‘freedom of speech’, not speech itself. Ensuring freedom of speech requires guardrails. As Jamie Susskind points out in his book ‘The Digital Republic’ (2022), the approach across Europe has been to place a positive duty on governments to ensure that free expression can be enjoyed safely.

Article 10 (1) of the European Convention on Human Rights (ECHR) reads:

“1. Everyone has the right to freedom of expression. This right shall include freedom to hold opinions and to receive and impart information and ideas without interference by public authority and regardless of frontiers. This article shall not prevent States from requiring the licensing of broadcasting, television or cinema enterprises.”

This is a very important assertion that must be strongly defended but is balanced by Article 10 (2) that rightly points out that where there are rights, there must also be responsibilities and respect for the rights of others.

“2. The exercise of these freedoms, since it carries with it duties and responsibilities, may be subject to such formalities, conditions, restrictions or penalties as are prescribed by law and are necessary in a democratic society, in the interests of national security, territorial integrity or public safety, for the prevention of disorder or crime, for the protection of health or morals, for the protection of the reputation or rights of others, for preventing the disclosure of information received in confidence, or for maintaining the authority and impartiality of the judiciary.”

The ECHR clearly envisions, therefore, that some limits must be in place.

The development of social media and online platforms has brought many incredible positives. It has helped democratise the sharing of news and ideas: to set out your views, it is no longer essential to get past the gatekeepers of opinion page editors, such as the one who kindly decided that this piece was interesting enough to run here.

The increase in the quantity of opinions on offer is generally not matched in quality. True news values such as fact-checking are often sacrificed, and good journalism is often restricted behind a paywall, while you can get any and all opinions, informed or not, usually for free. While the online world can provide a forum for informed debate from those with an expertise or interest we may not often hear, it can equally amplify the views of the ill-informed or spread misinformation.

The tech giants

Most social media platforms (Facebook / X (Twitter) / TikTok etc) simply allow users to upload content and then rely on other members of the ‘community’ to report the content if it goes against the ‘community standards’ that have been designed by the company.

The ‘Community Standards/Hate Speech’ set of rules operates on all such platforms. The rules are not determined by elected legislators (save where there is a requirement by a State on illegal content) but by the billionaire owners of such platforms and those who work for them.

This is the self-regulated approach to hate speech currently undertaken by the tech giants.

Mark Zuckerberg of Meta has said that about 95% of reported content or hate speech on Facebook is taken down by Artificial Intelligence. About 6.5 million reports are generated every week. The company still has about 15,000 content moderators globally. This figure was confirmed at the Oireachtas Media Committee recently where Meta representatives indicated that AI is used 90 to 95% of the time to remove content that goes against the platform’s rules on speech (the ‘community standards’).


Representatives of X also stated that over 90% of content was removed by AI, though alarmingly the number of human content moderators overseeing this process has been slashed globally from 5,500 before Elon Musk's takeover in November 2022 to 2,500 today.

Elon Musk presents himself as a defender of free speech and is often cited as a great liberator by those who argue against restrictions. But Musk himself has placed limits on freedom of expression, and only this week we saw this play out in the courts, where his own limits were laid bare.

When major advertisers threatened to pull advertising from X, Musk demanded greater levels of content moderation; employees at his companies must sign non-disparagement and non-disclosure agreements; and there have been cases where X has suspended the accounts of users who criticised Musk.

In 2007, Facebook entered into a settlement with the State of New York around the company’s alleged failure to protect children online, particularly with regard to access to pornography. Part of the agreement involved Facebook responding to and addressing complaints about nudity or pornography, harassment or unwelcome contact within 24 hours. The company’s actions would be independently verified by the State.

More generally, in the United States Courts, it has often been held that pornography is protected by First Amendment Free Speech rights whereas ‘obscenity’ is not. The question again is: where do we draw the line? (And who draws it?) Cultural attitudes here obviously influence one’s approach. The attitude to nudity or limited clothing, for example, would differ enormously from the beaches of Brazil or the Mediterranean to societies with strong religious disapproval of such choices.

There are 19 countries around the world that ban Holocaust denial. Engaging in speech that promotes this lie is a punishable offence. In Germany, understandably, this issue is taken particularly seriously and inciting hatred against any “national, racial, religious group or a group defined by their ethnic origins, against segments of the population or individuals because of their belonging to one of the aforementioned groups or segments of the population or calls for violent or arbitrary measures against them” could lead to imprisonment on conviction.

In 2018, Mark Zuckerberg, who is himself Jewish, defended the rights of Holocaust deniers, but by 2020 he had stated that his “thinking had evolved” because of evidence that social media postings were leading to an increase in anti-Semitic violence. Meta’s “Hate Speech Policy” now prohibits any content that denies or distorts the Holocaust.

Responsibility as legislators

As private sector platforms debate the limits of free speech internally, we as legislators face similar challenges but our responsibilities are much greater. While Musk and Zuckerberg and others are beholden to their shareholders, legislators and regulators need to think about citizens and society.

How do we get the balance right with competing rights?

In an increasingly polarised political environment (exacerbated by the social media companies), attempts to have a civilised debate can be difficult.

Pim Fortuyn was a socially liberal but anti-immigration Dutch politician whose party, founded in 2002, came second in the Dutch general election that year. Fortuyn was gay. He was strongly critical of Islam, viewing it as a threat to the Dutch way of life. Yet he defended the right to free speech of homophobic Islamic leaders, albeit with an important caveat: “An imam should be able to say that homosexuals are worse than pigs. My only demand is that you mustn’t incite violence.” Fortuyn was shot dead nine days before the election. The Dutch politician drew the line where there was a threat of violence.

Meta indicated when it came before the Oireachtas Media Committee that it would hold a similar position. It used the example that where somebody posted “I hate Emmanuel Macron and he is a useless politician” that such would not constitute a breach of community standards, whereas if somebody stated that the French President “deserves to be shot and I encourage people to do it”, such a statement would be deemed in breach.

While blasphemy was removed from the Constitution in 2018 (who remembers that referendum?), there is always a tension where freedom of expression is seen as insulting to deeply held beliefs. This has been evident in instances such as the response by some in the Islamic tradition to Salman Rushdie’s ‘The Satanic Verses’, or to the cartoons of Mohammed published in Denmark and by the French magazine Charlie Hebdo. Equally, though, Christians are often offended by pieces of art that are perceived to be insulting to Christ.

Racist speech is still sadly too common in Irish society. In April last year, the outstanding Wexford sportsman Lee Chin was racially abused from the sidelines at a charity hurling match in Tipperary. The GAA handed down a 48-week ban to the spectator who shouted the abuse. Was this a justifiable restriction on that spectator’s ‘right’ to freedom of speech? Should individuals have a right to scream racist (or sexist or homophobic) abuse at a game, or indeed anywhere else? This is not just a challenge for the GAA: other sporting codes have also acknowledged the problem.

Interestingly, at the Oireachtas Sports Committee, the GAA supported the introduction of Hate Speech legislation precisely to deal with this sort of behaviour.

In this article, I hope that I have outlined some of the issues that legislators and regulators face in trying to balance freedom of speech with other rights such as personal safety, human dignity and social cohesion. Self-regulation already takes place by the online platforms and in other spheres.

There are valid concerns around some of the definitions in the Hate Speech legislation. It is important that any laws are clear and enforceable, and that they achieve their intended purpose while avoiding unintended consequences. These concerns must be addressed in the next stage of the debate on this law. However, we have to maintain a focus on the purpose of these laws: to tackle the growing level of incitement to violence that results from hateful intolerance.

I would hope that in considering these issues we can have a reasoned and evidence-based debate (as I am sure there will be in the comments section below) and that we can determine where we need to draw the line.

Malcolm Byrne is a Fianna Fáil Senator and member of the Oireachtas Media Committee.
