Arcade Showcase Game at the Hand Eye Society’s Wordplay Festival 2018

“Josh Labelle’s good, unsettling game pits you against obscene user content as quality control for a social network. Labelle pairs pixelated art with descriptions, and gives you a cheatsheet of guidelines to decide what stays and what goes. Watch out for despair.” - Caroline Delbert, Autosave.TV

Believe it or not, when you flag a post on Facebook, a living human being typically has to look at it and determine whether it actually violates community standards. The bulk of this work is not done by algorithms.

These "social media content moderators" are often third-party contractors in newly industrialized nations like the Philippines. They have to memorize thousands of complicated rules regarding community standards... but must often fall back on gut instinct. They're paid as little as $2 US per hour and have an average of 4 seconds to decide whether to delete or keep each post. They're expected to get through tens of thousands of posts in a day, and many of the images and videos they review contain disturbing subject matter: torture, beheadings, murder, exploitation of children, extreme gore, animal abuse.

They receive no psychological counseling to help deal with the images they're forced to see.

In Sentry, you play as Louise, a young content moderator at one of these contractors. While at work, you'll decide which images to leave up and which to delete. While at home, you'll deal with the effects of being forced to sift through the entire world's exported psychic trash. Can you make it one whole week?

For more information on the true stories behind this game, listen to this recent episode of Radiolab: