From May 2020 to June 2021, the EViD lab ran a series of trials to design the Content Moderation by Design (CMbD) Game. By playing the roles of both the trust and safety policy team of a startup social media platform (Contentr) and a content moderator, participants experience some of the challenges of moderating user-generated online content in a way that balances values such as free expression and community safety.
The game has three rounds. First, players work as a team to develop the policies that will shape the kind of platform they want to grow. Next, players switch roles from policy developers to content moderators, using those policies to make moderation decisions based on real-life examples. Finally, players see how their decisions play out: final scores are calculated from real-life examples of moderation decisions and how those decisions affect two areas, free expression and community safety.
Social media content moderation practices vary from company to company, are inherently opaque, and extend well beyond simply allowing or banning content. This game is meant to give a taste of the challenges posed by hosting a site for user-generated content; it should in no way be interpreted as a comprehensive overview of trust and safety practices.
The categories of content and roles were informed by:
Block, H., et al. (2018). The Cleaners. (Film)
Gillespie, T. (2018). Custodians of the Internet: Platforms, content moderation, and the hidden decisions that shape social media. Yale University Press.
Roberts, S. T. (2019). Behind the screen: Content moderation in the shadows of social media. Yale University Press.
The game design research was sponsored by the National Science Foundation under award CNS-1452854. UMD IRB 1682807-2.
Special thanks to everyone who participated in game design trials!
Direct questions to firstname.lastname@example.org