At the Nexus of Gaming and Extremism
Galen has been a gamer his entire life: from Oregon Trail to Half-Life, then countless hours in Morrowind and Dota, all the way up to Starfield today. He grew up hosting LAN parties in garages and building his own gaming rigs, planning to work in tech. Instead, he ended up researching wars and, more recently, the types of violent extremism that often serve as a prelude to particularly nasty conflicts. Over the past few years, he watched othering, violent language, and increasingly white supremacist, decreasingly ironic memes intensify across online gaming communities. Scrolling through DLive streams in 2019, one could go from Let’s Play videos by (semi)popular livestreamers to, within one or two recommendations, some of the most extreme far-right content inciting violent insurrection and genocide.
Jessica did not grow up a gamer, but after years of working in the security sector and researching terrorism and violent extremism, she arrived at the same recognition of increasingly extreme sentiment within online spaces. Her research into misogyny, racism, xenophobia, and other expressions of far-right extremism, and into how progressively more populist politics and global crises like the pandemic have mainstreamed these beliefs, forged a match made in researchers’ heaven. Once Galen started fishing around for others interested in this space, it quickly became apparent that we needed to form a network to capture the learning and needs of researchers, policy-makers, practitioners, and industry. Thus the Extremism and Gaming Research Network (EGRN) was born.
Today, the Network has over 120 members ranging from governments to universities, think tanks, and tech platforms, and we are working together to untangle how extremists are exploiting the video games and communities we love—and how we can help make them more resilient against those exploits. This is not just a research puzzle for academics to crack or a security risk for policymakers to address. It is a societal issue that affects us all. Imagine your friend, your child, your student—an avid gamer—inadvertently coming across a server where extremist ideologies are being disseminated. This not only exposes them to hateful, vile content, but could also be a step down a dark, radicalizing path. The impacts of increasing exposure to hateful ideas, extremism, and hate-based discrimination are not confined to the virtual worlds of gaming; they infiltrate families, social circles, and, most frighteningly, our way of thinking about the world.
So, what can be done? There have always been dark corners of the internet fertilizing and spreading hate-based culture and norms (remember the “rules of the internet” from 4chan back in ’07?). Extremist uses of games are not new: we can chart neo-Nazi and jihadist titles dating back to the ’90s. But as games and the immersive, social digital environments around them have grown more advanced, so have the tactics of extremists online. Trolling and sexism in gaming communities (looking at you, GamerGate) are becoming more vitriolic and vile, and white supremacist recruiters are actively exploiting these underlying currents to find new members. Learning more about these digital communities and identities can help us find better ways to remove hateful, extremist content, regulate harms, and build positive environments and preventative programming. It will take everyone working together—researchers, gamers, policy-makers, security practitioners, and game designers—to effectively reinforce gaming spaces against exploitation by extremists and protect the billions of people who play games today, as well as future generations.
There are a few trends that members of the EGRN have charted that we think are particularly important. These can generally be thought of as organic (i.e., an innate part of gaming) or strategic (i.e., specific and intentional use by extremist actors) [Englund and White 2023]:
(1) Online gaming and gaming-adjacent spaces, such as forums and live streaming, help to organically form culture and identity. While that is positive for most users, the socialization of gamers, especially in online cultural milieus that normalize toxicity, can form “us versus them” group identities that align easily with extremist ideologies [Kowert et al. 2023].
Game developers can consider the impact of the narrative story lines written into their games and how these might be co-opted by misogynist or racist extremist narratives. They can also think about representation and accessibility: does everybody get a chance to play, and in a skin that resonates with them? This can be important for increasing diversity and inclusivity in gaming spaces [Wallner et al. 2023].
(2) The personal identities and social bonds formed through gaming together online can shape exceptionally strong communities. As we have seen, this pro-social aspect of gaming can also be exploited by white supremacist and neo-Nazi groups that game together online as a bonding activity—an example of both organic and strategic use.
Gaming companies therefore need to strengthen their terms of service and community moderation standards, so it is clear to gamers on their platforms that toxicity is not allowed—and where they can report or flag hate-based harassment when they come across it.
(3) We see terrorists and extremists creating their own video games and mods, from anti-Semitic titles like Ethnic Cleansing to more recent mods for Hearts of Iron IV and standalone FPS games produced by the Islamist group Hezbollah. We have charted at least 30 such titles over the past two decades, most of which appeal to users already at least somewhat aligned with the views of the producing group. This is where content moderation efforts come into play.
Organizations such as the Global Internet Forum to Counter Terrorism and Tech Against Terrorism work to hash and flag terrorist content online so that companies can more easily find and remove it from their platforms. This, of course, is easier when the content is blatantly illegal under terrorism legislation, and more difficult when classifying it as extremism is a matter of interpretation. This is where conversations like those happening within the EGRN can help inform industry safety-by-design and trust-and-safety efforts.
(4) We also know that games, as pop culture, hold propaganda value. When ISIS (Daesh) uses GTA IV and Arma 3 footage in its recruitment videos, or far-right groups use Viking imagery from Assassin’s Creed, they are intentionally tapping into cultural reference points. Where gaming companies support influencers to help build and spread the culture of their games, it is important to ensure that the power these individuals wield builds positive and inclusive communities, not increasingly toxic spaces that make certain games more exploitable by extremists.
These are just a few examples of issues at the nexus of gaming and extremism. The issue will only gain urgency as gaming technologies become more sophisticated and as digital and physical realities intertwine more closely. Now is the time to invest in comprehensive strategies that build the resilience of people who play games and that involve all stakeholders—researchers, policy-makers, the tech industry, and especially everyday gamers. By laying the groundwork today, we can avoid a future where the gaming worlds we escape to for enjoyment become the new battlegrounds for extremist ideologies.
Jessica White and Galen Lamphere-Englund
References
Englund, G. and White, J. 2023. The online gaming ecosystem: Assessing digital socialisation, extremism risks and harms mitigation efforts. Global Network on Extremism & Technology. Retrieved from https://gnet-research.org/2023/05/26/the-online-gaming-ecosystem/
Kowert et al. 2023. You are what you play: The risks of identity fusion in toxic gamer cultures. ACM Games Res. Pract. 17 (2023), 1–3.
Wallner et al. 2023. Building resilience to extremism in gaming: Identifying and addressing toxicity in gaming culture. Radicalisation Awareness Network Policy Support. Retrieved from https://home-affairs.ec.europa.eu/system/files/2023-11/RAN-building-resilience-extremism-gaming_en.pdf