Rights watchdog alarmed at 'arbitrary removal' of Irish social media posts about Palestine

The intervention comes on foot of incidents including Basketball Ireland’s Instagram account being temporarily disabled and a newspaper having its advert account blocked.

A HUMAN RIGHTS watchdog has expressed alarm at what it believes is “arbitrary removal” of social media posts by Irish companies and organisations, seemingly in connection with Israel’s war on Gaza.

The Irish Council for Civil Liberties (ICCL) said it was very concerned at the practice which it warned could have a “lasting chilling effect” on public debate if social media companies aren’t held accountable.

Meta, the owner of Facebook and Instagram, both of which have faced complaints of alleged censorship of Palestinian content, told The Journal that it rejects any claim that it is “systemically suppressing” such voices on social media.

However, the company admitted that it makes “errors” in moderating content, which it accepted are frustrating for users.

The ICCL’s intervention comes on foot of incidents including Basketball Ireland’s Instagram account being temporarily disabled, which happened amid a furore over the women’s national team’s game against Israel.

A Dublin newspaper also saw its advert account with Facebook temporarily suspended after it sought to boost coverage of Palestine-related debates in Dublin City Council.

International examples

In recent months, global organisations such as Human Rights Watch have highlighted similar concerns over “systemic censorship” of posts about the war that express sympathy with Palestinians or criticism of Israel and its army.

In a 51-page report published in late December, called ‘Meta’s Broken Promises’, Human Rights Watch documented “flawed” moderation of content on Facebook and Instagram.

It accused the company of operating content moderation policies and systems that “increasingly silenced voices in support of Palestine” in the wake of the conflict between Israeli forces and Hamas.

Now, the ICCL’s senior policy officer for surveillance and human rights, Olga Cronin, has echoed these concerns about the level of transparency around how social media companies decide to remove or block content.

Cronin told The Journal that the ICCL was very concerned by the “arbitrary removal” of content of “Irish and other accounts by social media companies, seemingly in connection with posts relating to Israel and Gaza”.

Cronin also pointed to a recent publication by the non-profit digital rights organisation Access Now which questioned whether there was “over-moderation” of Palestinian content by Meta.

“Disabling, banning or suspending content, especially content airing political views and, in particular, from a journalistic outlet, is a severe infringement of the right to freedom of expression,” Cronin said.

In the long term, she added, curbing these types of views without clear explanation could have a lasting chilling effect on public discourse.

Cronin said that the ICCL will continue to advocate for Big Tech companies to respect people’s rights to freedom of expression, privacy and protection of personal data.

“They should be radically transparent about their content moderation policies, mechanisms and decisions,” she said.

“At a bare minimum, they should be fully disclosing the rules and mechanisms they use to moderate content,” Cronin added.

This would mean explaining in detail why a post was taken down and how the company’s own rules were applied; it would also mean explaining how someone can appeal such decisions and how the company can be held to account for wrongful takedowns.

Meta response

When contacted, Meta said that it maintains guidelines in its Community Standards, which publicly outline what is and is not allowed on its platforms.

It added that it publishes detailed information on its Transparency Centre about how it enforces those policies.

“We acknowledge that we make errors that can be frustrating for people, but the implication that we deliberately and systemically suppress a particular voice is false,” a Meta spokesperson said.

This admission that Meta was making errors in removing content related to the war was reiterated in a continuously updated statement on its website covering how its platforms are responding to Israel’s invasion of Gaza.

“We want to reiterate that our policies are designed to give everyone a voice while keeping people safe on our apps,” the statement about the ongoing war reads.

“We apply these policies regardless of who is posting or their personal beliefs, and it is never our intention to suppress a particular community or point of view. Given the higher volumes of content being reported to us, we know content that doesn’t violate our policies may be removed in error.”

Ireland-based incidents

Last week, Basketball Ireland saw its Instagram temporarily disabled.

The organisation noted this came after its senior women’s team became the subject of “added media and social media attention” relating to a recent game against Israel.

The game became a political flashpoint after the Israeli team was photographed with armed soldiers, and there were calls for Ireland to boycott the fixture despite the heavy fines a boycott would have incurred.

After complaints by Basketball Ireland, its Instagram account was reactivated by Meta.

Meanwhile, local independent newspaper the Dublin Inquirer found its adverts account suspended by Meta earlier this month after it sought to boost coverage of Palestine-related debates in Dublin City Council. It has since been restored.

The paper’s co-founder and deputy editor Sam Tranum expanded on the matter on X, formerly Twitter.

He said that, at first, he assumed it was a coincidence, before finding out that organisations such as Human Rights Watch had published reports about the “phenomenon”.