
Facebook’s Meta logo sign at the company's headquarters in California. Tony Avelar/AP

Meta reviewing its oversight board's call to make adult nudity policies more inclusive

It came after the tech giant removed Instagram posts showing transgender and non-binary people with their chests bared.

META IS REVIEWING a call by its oversight board to make its adult nudity policies more inclusive after the tech giant removed two Instagram posts showing transgender and non-binary people with their chests bared.

Neither post violated Meta’s policies on adult nudity, and in a statement released earlier this week, the board said it had overturned the company’s decision to remove them.

A Meta spokesperson told AFP that the company welcomed the board’s move and had already restored the images, agreeing they should not have been taken down.

But the board also seized the opportunity to call on Meta, which also owns Facebook, to make its broader policies on adult nudity more inclusive.

The board counts a range of lawyers, retired judges and former Danish prime minister Helle Thorning-Schmidt among its members.

The current policy “prohibits images containing female nipples other than in specified circumstances, such as breastfeeding and gender confirmation surgery,” the oversight board wrote in its decision.

That policy, it continued, “is based on a binary view of gender and a distinction between male and female bodies,” and results in “greater barriers to expression for women, trans and gender non-binary people on its platforms.”

It called for Meta to evaluate its policies “so that all people are treated in a manner consistent with international human rights standards, without discrimination on the basis of sex or gender.”

The Meta spokesperson said the company was reviewing that request, which echoes calls made by activists for years.

“We are constantly evaluating our policies to help make our platforms safer for everyone,” the spokesperson said.

“We know more can be done to support the LGBTQ+ community, and that means working with experts and LGBTQ+ advocacy organizations on a range of issues and product improvements.”

“We have given Meta food for thought,” oversight board member Thorning-Schmidt said yesterday in a forum at Instagram.

“It’s interesting to note that the only nipples not sexualized are those of men or those who have been operated on.”

“Over-policing of LGBTQ content, and especially trans and nonbinary content, is a serious problem on social media platforms,” a spokesperson for advocacy group GLAAD told AFP.

“The fact that Instagram’s AI system and human moderators repeatedly identified these posts as pornographic and as sexual solicitation indicates serious failures with regard to both their machine learning systems and moderator training.”

Meta said it would publicly respond to each of the board’s recommendations on the matter by mid-March.

 – © AFP 2023
