
(File image) A content warning appearing on a post on social media. Alamy Stock Photo

Ireland's new media regulator explains how his team will tackle the most harmful online content

Coimisiún na Meán intervened to demand the removal of an extremely violent video from social media last week.

A VIDEO OF an extremely violent incident was recently removed from social media platforms following the intervention of Coimisiún na Meán, which stepped in to demand that companies ensure it was taken down.

The new online and broadcast media regulator, which is still being set up, is permitted to ask platforms to remove any illegal, violent or harmful content from their websites under the Online Safety and Media Regulation Act enacted last year.

The commission’s executive chairperson, Jeremy Godfrey, explained to The Journal how this will work.

The new law requires the commission to address harmful or illegal content, including user-generated content, across broadcast and online services in the country.

Godfrey said his team has only stepped in during “very urgent situations” so far, where there is “real continuing harm”.

The team is still in the process of establishing an “institutional” process, he added.

Godfrey said: “If we think there’s very significant, ongoing danger, we’ll informally speak to our contacts in the platforms, and that’s really how it’s worked.”

Godfrey said in the few cases so far, everybody concerned has been “on the same side” and there has been good cooperation from the companies to remove harmful content.

He added that the commission usually approaches the “largest and most significant platforms” first, as this is where the majority of the public are most likely to engage with the content.

Once the commission has flagged the dangerous posts to the companies, the platforms use “their own internal ways” of dealing with the removal. They are required to keep a watchful eye for reposting.

Under new regulations and laws, introduced this year and last year on a domestic and European level, it falls to the platforms themselves to make sure the content doesn’t make it on the site in the first place.

“We’re not next year – even when we are fully operational – going to be the first port of call for every piece of harmful content people see,” Godfrey said.

“The first port of call needs to be the platforms themselves and they will have legally binding obligations and we’ll be enforcing those obligations.

In general, it’s the platform’s responsibility to keep their platform safe, that is the whole basis of the new regulations.”

On some occasions, the commission is the first to contact the companies about a potentially harmful or illegal post that must be removed. In other cases, users of the platforms are the first to draw attention to the posts.

The chairperson said that within the next year, as both the commission and the companies become more familiar with the current legislation, the commission will look at the possibility of taking on individual complaints itself.

European rules

The EU’s new Digital Services Act (DSA) requires online platforms with over 45 million monthly active users in the EU to place a particular focus on moderating harmful and abusive content.

The DSA regulations became legally enforceable in August and EU member states are establishing and appointing their own agencies to enforce them.

Member states have until February to appoint “Digital Services Coordinators” to enforce the new rules.

Coimisiún na Meán will be in charge of enforcing the regulations here and handling Irish people’s complaints under the new EU rules.

Godfrey said so far, the commission has not yet encountered a situation where online companies have resisted removing harmful posts when asked to do so under Ireland’s legislation. 

He hopes that the new EU legislation will make it even clearer what is required from companies and strengthen cooperation with his agency.

He said the framework provided by the EU law means companies will have a “consistent way of responding” meaning “everyone knows what to expect”.

Dark place

Earlier this week, fake videos depicting Minister for Justice Helen McEntee and Taoiseach Leo Varadkar being shot were created and posted online.

Asked about the development, which was reported in the Irish Daily Mail, McEntee said that social media can at times be a “dark place”.

She said it was important for the public to understand “what they say and do on social media does have an impact in the same way as if they were to say and do it in person”.

Godfrey said: “The DSA places some obligations on the very large platforms, and that includes that they should assess risks that their platforms pose to things like democratic discourse, electoral integrity, public health, public safety.”

Under the DSA, the companies must then assess the risks and have mitigation tactics against those risks, he added.

Last week, speakers of European parliaments called for better protections for politicians against abuse both online and offline.

This included calls for stricter regulation of the monitoring of online platforms for extremism and radicalisation.

Speaking to The Journal in Dublin last week, Speaker of the UK House of Commons Lindsay Hoyle said social media’s role in the 6 January 2021 Capitol riot in Washington, DC was a “wake up call” for all international parliaments.