Games as Social Platforms

Online game companies can and should be included in new social media regulations.

Constance Steinkuehler

March 3, 2023

In September 2022, California's governor signed into law sweeping state legislation (A.B. 587) demanding transparency from social media platforms about their content moderation policies regarding hate speech, disinformation, and extremism. In it, policymakers don't just hold tech companies responsible to lawmakers; they hold them responsible to the public as well, by requiring companies to file semiannual reports detailing their policies toward hate speech, extremism (radicalization), disinformation campaigns, harassment, and foreign political interference, as well as how they protect their consumers against them via content flagging, moderation, and consumer-side consequences. Such reports will be made public in a searchable repository on the Attorney General's website. California is not unique here. There are "more than 100 bills in state legislatures across the country aimed at regulating social media content moderation policies,"7 and similar laws are now being argued before the Supreme Court.

Online game companies can and should be included in these new regulations.

Online games meet the definition of "social media companies" as detailed in the bill. They offer public or semipublic internet-based services in which one substantial function is to connect players so they may interact socially within the game, and players (1) create public or semipublic profiles, (2) populate a list of other players to whom they are connected in the system, and (3) post content shared with others (in the form of chatroom-style messages), including content generated by other players.

For those of us working on the front lines of toxicity and extremism in games (Rachel Kowert, Alex Newhouse, Petra Regeni, Linda Schlegel, Jessica White, and others), such regulatory effort would be a welcome and long-overdue relief, finally giving researchers access to reliable industry-side data on the real nature and extent of toxicity and extremism in online games.

But more importantly, we need online games to be included in such policies so that consumers can make informed decisions about the games they buy for themselves and for their kids. With such data made available, public-serving (and public-protecting) organizations such as the Anti-Defamation League (www.adl.org) and Common Sense Media (www.commonsensemedia.org) can make real strides toward a rating system that goes beyond company-selected snippets of gameplay voluntarily submitted to the ESRB to include, at long last, empirical data on player-generated online content. It has been an accepted fact since the earliest foundational work on MUDs and MOOs a good thirty years ago that online games are part company-designed and part community-authored; they are social worlds that are "corporate owned but player constituted."5 It is high time that policy and law caught up to this fact.

Online game companies need to start being more honest about what is happening in their games and what they are actually doing to protect consumers (and democracies).

T.L. Taylor is right to argue that "We need more engagement with the tradition of community management (understanding the ongoing, cultivating nature of working with users), and better understanding of how online spaces might leverage positive socialization and practice restorative, not just retributive, modes of justice."6 Simple "ban hammers" are crude instruments for addressing player-on-player disruptive behavior and outright hate-based harassment. That is one reason why the work of Dan Cook1 on social design and player trust, of Kimberley Voll and her Fair Play Alliance colleagues,4 and of Weszt Hart3 of Riot Games is so crucial.

But it is also true that large game corporations that profited wildly off the public, particularly during two years of a global pandemic, falsely advertise their online wares as positive, playful online spaces where we are "united through the universal joy of play,"2 all the while lobbying behind the scenes to block efforts to better protect players from harassment and hate. All in the name of member-company profit.

Indeed, in every conversation I have with colleagues in the C-suites of game companies, they readily admit that, until the problems impact the bottom line, there is really little motivation to fix them.

But the costs of toxicity and extremism in online games are not only social, psychological, or moral; they are also financial. Data from our most recent survey and interview study among online game players ages 13–25 (n=602, power=80%, alpha=0.05, Cronbach's alpha 0.66 and above on all items) show that roughly half (49.3%) of players report avoiding some game titles due to toxicity among their player bases. Indeed, the average monthly spend on online games deemed "non-toxic" by players was $21.10, compared to $12.09 on games deemed toxic: roughly 75% more revenue for games that don't serve their consumers a stream of name-calling, racial epithets, Holocaust denial, misogyny, threats to their safety, and garden-variety rape and death threats. Thus, the costs of toxicity online are not merely moral; they are financial realities, largely hidden from the books of major game companies because those companies simply don't account for the revenue they forgo.
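To make that figure concrete, here is the arithmetic behind the comparison, taking the reported averages at face value:

($21.10 − $12.09) / $12.09 ≈ 0.745

That is, average monthly spend per player is roughly 75% higher on the games players deem non-toxic.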

Hate speech, extremism (radicalization), disinformation campaigns, harassment, and foreign political interference aren't just ugly; they're expensive.

References

  1. Cook, D. Vision. Games: Research & Practice 1, 1 (2022).
  2. Essential facts about the video game industry. Entertainment Software Association (2022); https://www.theesa.com/resource/2022-essential-facts-about-the-video-game-industry/.
  3. Hart, W. Player dynamics design: Looking behind the curtain. Riot Games (May 12, 2022); https://www.riotgames.com/en/news/player-dynamics-design-looking-behind-the-curtain.
  4. Lewington, R. and the Fair Play Alliance Executive Steering Committee. Being 'targeted' about content moderation: Strategies for consistent, scalable and effective response to disruption & harm. Fair Play Alliance White Paper (April 2021); https://fairplayalliance.org/whitepapers/.
  5. Steinkuehler, C. The mangle of play. Games & Culture 1, 3 (2006), 1-14.
  6. Taylor, T.L. Games matter. Games: Research & Practice 1, 1 (2022).
  7. Zakrzewski, C. New California law likely to set off fight over social media moderation. The Washington Post (September 14, 2022); https://www.washingtonpost.com/technology/2022/09/13/california-social-network-transparency.

Copyright

© 2023 Copyright held by the owner/author(s).

Constance Steinkuehler

Constance Steinkuehler is a professor of Informatics at the University of California, Irvine. She previously taught at the University of Wisconsin–Madison before taking public service leave to serve as a Senior Policy Analyst in the White House Office of Science and Technology Policy, where she advised on policy matters concerning video games and learning.
