
Humans at their worst

Meet the people on the other side of the world who try to keep the Internet free from porn

The Moderators is a new documentary which tracks the work of a group of trainees in India.


Video: Field of Vision / Vimeo

WHY DO WE not see pornography on our Facebook feeds?

We know that ‘dick pics’ exist everywhere on the Internet, but our eyes are shielded from them on most sites. How?

Violence of the most graphic kind is filmed on a daily basis but assaults aren’t put in front of us without some effort on a user’s part.

How come?

The answer: moderators. One hundred and fifty thousand of them.

Across the world, there are about 150,000 men and women screening millions of photos every day to ensure the Internet’s surface is shinier than its murky underworld.

“That is more than twice the headcount of Google and nearly nine times that of Facebook,” according to filmmakers Adrian Chen and Ciarán Cassidy.

The pair delved into the little-known industry of content moderation in a new documentary.

The Moderators tracks a number of trainees at an Indian firm as they learn what their new job will entail.

“Here the person is not naked but you can see the erection… it will also fall under nudity,” a supervisor tells his newbies in the opening shot.

Another succinctly tells them why their new job is so important:

Have you seen pornography in Facebook? You don’t. Why? Or do you think people will not put? So why you don’t see? So, there are moderators.

Throughout the week, viewers hear that without these companies – of which there are many, mostly based in India, the Philippines and Indonesia – the Internet would be “chaos”, “a porn factory” and other less desirable descriptors.

The moderators are expected to look at and approve or remove 2,000 photos per hour. It is work that puts ‘creepy’, ‘vulgar’ and ‘violent’ images in front of their eyes. About 20% of their quota will be deemed inappropriate for client websites.

The firm featured in the documentary has contracts with a number of dating sites and its employees are particularly aware of the importance of their role.

“The dating industry would not be flourishing as it is today online if moderators weren’t there,” says one senior employee.

“People go there to find their soul mate so people are very vulnerable there… that is where they even emotionally get connected to someone. It becomes very easy for an imposter to manipulate the person. So the other person behind the computer exploits your emotions if he is a scamster.”

During their training, the new workers are told that they will find the images they have to look at upsetting and, at times, disrespectful to their religion.

It is also noted that many of the people they are vetting live in different cultures – some people working in content moderation will see the mere act of dating as taboo but will have to leave their feelings to one side.

“This is part of moderation, okay, it is our job. Don’t take it personally. You are Hindu, you can get some of the images like they are disrespecting your religion…” they are told.

You are here to moderate those stuff so that no one else can see. That is why you are here.

Chen first shone a light on the massive industry three years ago, writing an investigative piece for Wired from “the second floor of a former elementary school at the end of a row of auto mechanics’ stalls in Bacoor, a gritty Filipino town 13 miles southwest of Manila”.

There, he talked to workers who handled content moderation for US social media sites.

Describing a ‘vast, invisible pool of human labour’, he wrote: “Companies like Facebook and Twitter rely on an army of workers employed to soak up the worst of humanity in order to protect the rest of us.”

Two years later, Irishman Cassidy got in touch with Chen with an idea for a follow-on film. With financing from production company Field of Vision and a firm willing to allow them access (a world first), the collaboration became possible.

A year later, with Chen and Cassidy directing, an all-Irish crew watched as the Indian trainee group came “face to face with the worst of humanity as they learn the ins and outs of their unusual job – screening offensive content posted to the Internet”.

“It is such a huge industry but we know so little about it,” Cassidy tells TheJournal.ie in an interview today, ahead of the film’s worldwide release.

“[The documentary] is very much a snapshot told from the trainee’s perspective – from training to sitting behind the computer.”

He notes that the firm involved in the filming is one of the good ones, which looks after its staff and remains “small by standards”.

“There are warehouses elsewhere that are bigger – and maybe work for clients that would attract a different, more difficult, type of content.

“There are a lot of questions to be asked about the industry. Who is using it? How does it work? How does it affect the people doing it?”

Acknowledging the existence of The Moderators is just the start.

The film premiered a few weeks ago at SXSW in Austin, Texas, and was screened last week at the IFC in New York.

