Adrian Chen wins November Sidney for spotlighting the workers who keep ‘dick pics’ and beheadings out of your Facebook feed

Adrian Chen wins the November Sidney Award for “Unseen,” a Wired feature on the invisible army of contractors who spend their days sifting through all the porn, gore, and hate speech that users try to upload to social networks.

The industry calls them “content moderators” or “mods,” but some might call them censors. There are over 100,000 of them, in the United States and overseas. Social networking giants like Google are notoriously secretive about what content moderators do and what criteria they use to do it.

Mods pick through all the child pornography, car accident footage, and animal abuse videos that the world’s internet users attempt to share each day. Some services engage mods only in response to complaints; others filter all posts in real time. Sometimes moderators have to bend the rules for content the company deems “newsworthy,” like footage of violent political protests.

For this work they may earn as much as $20 an hour in the U.S., or as little as $312 a month in the Philippines. The job takes its toll. Most people burn out within three to five months. The companies hire psychologists to evaluate prospective moderators and counselors to deal with their job stress, but PTSD-like symptoms are common.

“Chen is working on the cutting edge of labor journalism,” said Sidney judge Lindsay Beyerstein. “This story raises important questions about workers’ wellbeing and free speech on social networks. And it’s a fun read.”

Adrian Chen is a freelance writer in Brooklyn, New York. He is a contributing editor to The New Inquiry.

A contractor at the Manila office of TaskUs, a firm that provides content moderation services to U.S. tech companies
Photo credit: Moises Saman/Magnum

Backstory

Lindsay Beyerstein interviewed Adrian Chen by email

Q: How did you become aware of the content moderation industry?

A: I’d written a few pieces for Gawker about the various outcries that erupted whenever Facebook deleted a politically sensitive image or profile for violating their content guidelines, and I always wondered how they determined what to delete–not from a policy standpoint but from a technical one. I’d assumed that it was some sort of algorithm with limited human involvement. But while working at Gawker, I was able to interview a former content moderator for Facebook, who worked as a contractor for $2 an hour, screening often-horrific images from his home in Morocco. That’s when I first realized that a significant amount of this work is being done by low-paid workers in developing countries, and as I looked into it more it became clear that an entire industry is built on selling this labor to American tech companies and other businesses.

Q: Why are these social media companies so reliant on contractors to do this work?

A: Obviously the biggest reason is cost. Moderation is a very labor-intensive process that needs to run 24/7. You need a large number of workers, compared with the relatively small number of core employees tech companies have. Also, flexibility: companies want to be able to quickly scale their moderation capability up or down depending on how fast they grow. So they hire cheap workers in the Philippines, or young recent college grads in the Bay Area who don’t get benefits and can be laid off without a hassle. Many companies will tell you that they outsource this so they can focus on their “core” business, but this is disingenuous. If allowing people to share content on their service is their core business, moderating this content is just as central.

Q: A lot of this work is done overseas. The wages are much lower, but so is the cost of living in these countries. Are these workers compensated fairly?

A: The content moderators I spoke to in the Philippines were paid in line with other workers in the outsourcing industry. One moderator for Microsoft ended up making $500/month after three years at his outsourcing company. This is pretty good in a country where something like a quarter of the population lives on less than $1 a day. That said, because moderation is a low-skill, non-voice job, moderators typically make less than many people doing, say, phone-based tech support.

Q: The stress of looking at so much repulsive material wears people down. Can you get PTSD from this?

A: I’m not sure about the clinical diagnosis, but the two mental health experts I spoke to who have dealt extensively with content moderators say many exhibit PTSD-like symptoms: paranoia, “compassion fatigue” where they become sort of deadened to human suffering, depression, sexual dysfunction, etc. It of course depends on what kind of images they see, and how good the work environment is. I think it’s also important to note that the pressure of having to keep up with their quotas was as hard for some workers as the images themselves. It’s the combination of the repetitive, stressful work and the horrific images that can make the work especially degrading. And if they’re only temporary contractors, the sense of being literally disposable also contributes to the stress.

Q: Facebook and other major players are very tight-lipped about their content moderation policies. Should users have a right to know what they’re not seeing, and why?

A: Facebook, Google and Microsoft are absurdly opaque about the process of content moderation, considering how crucial it is to their business. I hope that my article has prompted journalists with better relationships with these companies to ask more questions about their content moderators.

From a basic consumer rights angle, people should know that their personal information is being sent potentially thousands of miles away, to a third party about whose relationship to the client company almost nothing is known. There is also the more philosophical question of whether dumping the worst humanity has to offer onto an invisible army of low-paid workers, so we can share vacation photos in blissful ignorance, is the best solution the tech industry can come up with for this problem. And if it is, how are these companies making sure these workers are cared for and fairly compensated for the very clear discomfort and risk they take on for the rest of us? We can’t start to have that conversation without knowing who these workers are.

Q: A few months ago, the Hillman Foundation honored a feature about the so-called “Mechanical Turk,” which connects tens of thousands of digital pieceworkers to big companies to perform repetitious tasks from home. Did you find any evidence that Turkers are involved in content moderation?

A: I looked into this, and there is definitely some lightweight moderation happening through Mechanical Turk–I even did a little of it. But it seems that any serious content moderation requires a captive labor pool managed by an outsourcing firm. The moderation is too complicated to have random strangers on the internet do it. (Which is also why hardly any of it is automated.)

Q: You interviewed an academic who said that social networking companies want to downplay the human work that goes into creating digital spaces. Why is that?

A: Would you want to lay out your personal trash for everyone to poke around in? Social media companies are hyper-sensitive to the appearance of harboring child pornographers, or encouraging cyberbullying, so I think there is a simple “see no evil” philosophy driving some of the secrecy. Then there are sensitivities around outsourcing, of course. More fundamentally, content moderation undermines the idea that tech companies are ushering in a new information economy built on code and ideas–not the unpleasant manual labor done by content moderators.

Adrian Chen