Individual Control over Exposure to Combat Toxicity in Games

Giving players and communities control over how much—and what types of—toxicity they are exposed to may be an effective way of combating it.

Julian Frommel and Regan Mandryk

March 14, 2024

Toxicity is a reality of online multiplayer games. Most players know that games are toxic; the industry recognizes it as a problem, prompting increased research on its causes and on strategies to mitigate its harmful effects. Addressing toxicity typically involves reporting, automated detection, and actions such as sanctioning and muting. Yet combating toxicity remains challenging due to its normalization and its subjective nature: different individuals consider different things toxic.

In this viewpoint, we argue that giving players and communities control over how much—and what types of—toxicity they are exposed to may be an effective way of combating it. In upcoming work, we (the authors) define toxicity as “disruptive behaviors that are perceived as harmful by others.” This definition intentionally focuses on the affected person and their experience, raising questions about why individuals do not have more ways of controlling what behaviors and communications they are exposed to.

It is clear that extreme or egregious forms of toxicity (e.g., hate speech and harassment) are never acceptable, regardless of how they are perceived. Less severe forms of toxicity (e.g., trash talk and banter), however, are more of a gray area, requiring specific consideration by communities and platforms, which cannot rely on external guidance such as anti-hate legislation to combat disruptive behaviors that may be harmful to some but not to others. In her essay on consent in games, Emma Vossen [3] discusses trash talk in the context of the magic circle, a conceptual space separating play from the outside world in which the rules can differ from those outside. Vossen highlights that what counts as acceptable trash talk versus harassment is subjective and that trash talk is part of the game for some players. As such, more ambiguous forms of toxicity, such as trash talk, may be acceptable as part of competitive play.

On the other hand, she also describes how such behavior can break the magic circle for others who have different preferences and never consented to being exposed to it. Vossen concludes that common suggestions—such as not playing games—are not a solution. Instead, it may be more useful to “think critically about our consent and boundaries” [3], which, in the context of in-game trash talk, means control over what one is exposed to.

What if we could do this in practice?

If we could provide tools to players and communities so they can choose what they are exposed to, we could give them more control over their environment. Game communities can shape codes of conduct for what is acceptable as part of a specific game; subcommunities, e.g., on game servers, can make their own rules, either banning behavior that may not fall under game-wide codes of conduct or explicitly allowing some behaviors such as trash talk; player groups can agree on informal rules about the types of acceptable trash talk; and individual players can control their own exposure. This would all be possible if we had better tools, such as more widely available and accessible muting and avoiding features, individual control over the dictionaries used for automated filtering, or even individual thresholds for the predicted toxicity of filtered messages. Individuals and communities would then be able to control the amount and type of trash talk and game banter they are exposed to.
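
As a concrete illustration, the sketch below shows how such per-player controls could sit on top of an existing chat pipeline. It is a minimal Python sketch under stated assumptions, not a description of any shipping system: the ExposurePreferences fields, the should_display function, and the idea that a toxicity score already comes from the game's automated classifier are all hypothetical.

```python
from dataclasses import dataclass, field


@dataclass
class ExposurePreferences:
    """Hypothetical per-player settings for controlling exposure to in-game chat."""
    muted_players: set = field(default_factory=set)   # senders this player never wants to hear from
    blocked_terms: set = field(default_factory=set)   # personal additions to the filter dictionary
    toxicity_threshold: float = 0.8                   # hide messages the classifier scores above this


def should_display(message: str, sender: str, toxicity_score: float,
                   prefs: ExposurePreferences) -> bool:
    """Decide, for one player, whether a chat message respects their stated boundaries.

    The toxicity_score is assumed to come from whatever automated detection the
    game already runs; this sketch only adds the per-player visibility decision.
    """
    if sender in prefs.muted_players:
        return False
    lowered = message.lower()
    if any(term in lowered for term in prefs.blocked_terms):
        return False
    return toxicity_score <= prefs.toxicity_threshold
```

A player who enjoys banter could raise their threshold and leave their blocked-term list empty, while another player could lower the threshold and add terms they never want to see; the same message might then be shown to one player and hidden from the other.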

Giving players and communities control over their exposure to toxicity would ideally reduce harm. Currently, trash talk—according to our definition—is usually considered toxic because someone perceives it as harmful. In a world where people have control over which types of behaviors they are exposed to, it would still be trash talk but would no longer be perceived as harmful—thus, it would not be toxic to those who consented to it. While a top-down approach to defining codes of conduct will always miss some behaviors or language that are harmful to an individual, control over exposure would allow that individual, for example, to define terms that they perceive as harmful and do not want to see, even if those terms are acceptable at the community level.
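
One way to reconcile these layers is to let each level only tighten, never loosen, the level above it: the game-wide baseline bans egregious behavior, a community can add its own restrictions, and an individual can add more on top. The sketch below illustrates that layering as one possible design choice; the Policy structure and effective_policy function are hypothetical and not drawn from any existing system.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Policy:
    """Hypothetical exposure policy; a lower threshold and a larger blocklist are stricter."""
    toxicity_threshold: float
    blocked_terms: frozenset


def effective_policy(game_baseline: Policy, community: Policy, player: Policy) -> Policy:
    """Combine the layers so each one can only tighten, never loosen, the one above it."""
    return Policy(
        toxicity_threshold=min(game_baseline.toxicity_threshold,
                               community.toxicity_threshold,
                               player.toxicity_threshold),
        blocked_terms=game_baseline.blocked_terms
                      | community.blocked_terms
                      | player.blocked_terms,
    )
```

Under this design choice, a community that welcomes trash talk simply keeps a high threshold and a short blocklist, and a player who finds a particular term harmful can still hide it for themselves without changing the rules for anyone else.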

Further, a control-over-exposure approach can complement top-down approaches to combating toxicity, which is currently mostly the responsibility of game makers, who typically address it with sanctions. If players had more control over the type of content they were exposed to, moderating these spaces would become easier because there would be less ambiguity and fewer situations in which someone is exposed to behavior they perceive as harmful. Game makers could help players by allowing, applying, and improving individual control approaches. Right now, even de facto standard approaches like reporting, muting, or avoiding are not implemented in all games, or are not usable, accessible, or well integrated (e.g., not transparent [2]), and codes of conduct are often inaccessible and framed in legal language that is hard to understand [1]. By improving this, game makers could provide more individual control, ultimately combat toxicity and its harm, and make their own task of moderating game communities easier.

The approach here is not without risk and has the potential to create new challenges. How would this work in game settings with a dominant culture or an imbalance of power, impeding an individual’s ability to express their preferences? How can players play with each other if they disagree on what is acceptable? How can we avoid this approach leading to echo chambers and even more normalization of toxicity in some communities? How can we prevent individual players from being ridiculed for their preferences? How do we design a system that is accessible and effective for expressing and controlling preferences?

We need more research into the effects of a control-over-exposure approach before we can comfortably suggest implementing it at scale. While we understand that this may not be the silver bullet that solves toxicity, it may be worth reconsidering approaches to combating toxicity that have largely failed to solve the problem. Importantly, with this approach, we do not want to downplay the systemic problems of toxicity and hate in games, argue for more lenience toward harm in games, or absolve game makers and other platform providers of their responsibility to combat toxicity. Rather, we think that this approach may complement top-down approaches in a way that acknowledges the reality of normalized toxicity in gaming spaces. If we could implement an approach that allows individual and community control over exposure, players could experience more autonomy and control and ultimately encounter less disruptive behavior that they perceive as harmful.

References

[1] Thomas D. Grace, Ian Larson, and Katie Salen. 2022. Policies of misconduct: A content analysis of codes of conduct for online multiplayer games. Proc. ACM Hum.-Comput. Interact. 6, CHI PLAY, Article 250 (October 2022), 23 pages.

[2] Yubo Kou and Xinning Gui. 2021. Flag and flaggability in automated moderation: The case of reporting toxic behavior in an online game community. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (CHI '21). Association for Computing Machinery, New York, NY, Article 437, 1–12.

[3] Emma Vossen. 2018. The magic circle and consent in gaming practices. In Feminism in Play. Palgrave Games in Context, K. Gray, G. Voorhees, and E. Vossen (Eds.). Palgrave Macmillan, Cham.

Julian Frommel

Julian Frommel is a tenured assistant professor in the Interaction/Multimedia group at Utrecht University. His current research in human-computer interaction and human-centered AI investigates the benefits and harms of interactive systems.

Regan Mandryk

Regan Mandryk is a professor of Computer Science at the University of Victoria, Canada. In her work, she designs, develops, and evaluates novel games that improve the social and emotional well-being of people, while also investigating factors (e.g., toxicity, obsession) that undermine the benefits of play.
