While members of Congress waste time grandstanding about “censorship” and politicians threaten to jail tech executives for their moderation choices, actual trust & safety teams are quietly doing work that saves lives in active war zones. But you’d never know it from all the political theater.
That’s why we’ve partnered with the International Committee of the Red Cross to launch Trust & Safety: Armed Conflict, a sequel to our Trust & Safety Tycoon game that focuses specifically on the impossible decisions these teams face during real armed conflicts.
In this game you have been hired to manage the “Conflict and Crisis Team” of our fictional social media company, Yapper, just as armed conflict breaks out between two fictional countries, Alpesia and Solferinia. You will need to make a series of decisions designed to avoid doing harm to the people living through an active crisis, while still considering the public perception of Yapper itself… all while making sure you actually have the team capacity to manage everything going on.
As trust & safety has become a political football—with many treating content moderation as some grand conspiracy rather than the messy, impossible work it actually is—the real consequences of these decisions have been completely lost in the noise.
This isn’t some abstract thought experiment. Trust & safety teams are regularly dealing with impossible decisions in war zones, understanding coded language that may (or may not) be inciting further violence, balancing the diverging and conflicting demands from many different parties (some of whom are at war with each other), and recognizing that demands for data might be about helping refugees… or might be about figuring out who to target.
Get things wrong and people can die.
Hardworking teams are dealing with challenges like this all the time, while some people seem to think their biggest challenge is whether or not they should be letting people call each other slurs.
Maybe if more people understood what trust & safety teams actually do—day in and day out, in crisis zones around the world—we could have more productive conversations about content moderation instead of the current mess where every moderation decision gets filtered through the narrow and distorted lens of American political grievances.
As with our previous games, this game was a collaboration between our Copia Gaming effort and Randy Lubin at Leveraged Play.