In Wired, Adrian Chen profiles the small, poorly paid army of overseas content moderators who screen Facebook, YouTube, and other social network uploads to nix some of the truly horrible material that users try to foist on the unsuspecting public.
Other content moderation is done by American workers who earn more money but typically burn out after just a few months on the job, previewing the absolute worst of what the internet can dish up:
But as months dragged on, the rough stuff began to take a toll [on Rob, a recent U.S. college grad]. The worst was the gore: brutal street fights, animal torture, suicide bombings, decapitations, and horrific traffic accidents. The Arab Spring was in full swing, and activists were using YouTube to show the world the government crackdowns that resulted. Moderators were instructed to leave such “newsworthy” videos up with a warning, even if they violated the content guidelines. But the close-ups of protesters’ corpses and street battles were tough for Rob and his coworkers to handle. So were the videos that documented misery just for the sick thrill of it. [WIRED]
Chen’s piece offers a rare glimpse of an international workforce that slogs away behind the scenes to keep social networking spaces free of disturbing content.
The piece doesn’t really grapple with the larger cultural implications of private, profit-driven companies engineering their spaces to give minimum offense, often with little transparency or accountability to their millions of users about what’s forbidden or why. Facebook, for example, was criticized for censoring some photos of breastfeeding, an activity that is not only legal but endorsed by public health experts worldwide. The social network eventually backed down in the face of public criticism.
Hat tip: Amanda Marcotte