Extremism finds fertile ground in gamer chat rooms
There are rules people must agree to before joining Unloved, a private discussion group on Discord, the messaging service popular among video game players. One rule: “Don’t disrespect women.”
For those on the inside, Unloved functions as a forum where about 150 people embrace a misogynistic subculture whose members call themselves “incels,” a term that describes those who identify as involuntarily celibate. They share some harmless memes, but also joke about school shootings and discuss the attractiveness of women of different races. Users in the group – known as a server on Discord – can enter smaller rooms for voice or text chats. The name of one of the rooms refers to rape.
In the vast and growing world of gaming, views like these have become easy to come across, both within individual games themselves and on social media services and other sites, such as Discord and Steam, used by many gamers.
The leak of a trove of classified Pentagon documents on Discord by an Air National Guardsman who held extremist views brought renewed attention to the fringes of the $184 billion gaming industry and to how discussions in online communities can manifest in the physical world.
A report released Thursday by the NYU Stern Center for Business and Human Rights underscored how deeply entrenched misogyny, racism and other extreme ideologies have become in some video game chat rooms, and offered insight into why people who play video games or socialize online seem to be particularly vulnerable to such views.
The people who spread hate speech or extreme views have a far-reaching effect, the study argued, even though they are far from the majority of users and occupy only parts of some of these services. These users have built virtual communities to spread their harmful views and to recruit impressionable young people online with hateful and sometimes violent content – with relatively little of the public pressure that social media giants like Facebook and Twitter have faced.
The center’s researchers conducted a survey in five of the world’s major gaming markets – the US, UK, South Korea, France and Germany – and found that 51 percent of online gamers reported encountering extremist statements in multiplayer games during the last year.
“It may well be a small number of actors, but they are very influential and can have a big impact on gamer culture and on real-world events,” said the report’s author, Mariana Olaizola Rosenblat.
The historically male-dominated world of video games has long struggled with problematic behavior, such as GamerGate, a long-running harassment campaign against women in the industry in 2014 and 2015. In recent years, video game companies have vowed to improve their work cultures and hiring processes.
Gaming platforms and adjacent social media are particularly vulnerable to outreach by extremist groups because of the many impressionable young people who play games, as well as the relative lack of moderation on some sites, the report said.
Some of these bad actors talk directly to other people in multiplayer games, such as Call of Duty, Minecraft, and Roblox, using chat or in-game voice features. Other times, they turn to social media platforms, such as Discord, which first rose to prominence among gamers and have since gained wider appeal.
Among those surveyed in the report, between 15 and 20 percent who were under the age of 18 said they had seen statements supporting the idea that “the white race is superior to other races,” that “a particular race or ethnicity should be expelled or eliminated,” or that “women are inferior”.
In Roblox, a game that allows players to create virtual worlds, players have recreated Nazi concentration camps and the massive re-education camps built by the Chinese Communist government in Xinjiang, a predominantly Muslim region, the report said.
In the game World of Warcraft, online groups – called guilds – have also announced neo-Nazi affiliations. On Steam, an online game store that also hosts discussion forums, one user named himself after the chief architect of the Holocaust; another incorporated antisemitic language into an account name. The report uncovered similar usernames associated with Call of Duty players.
Disboard, a volunteer-run site that lists Discord servers, includes some that openly advertise extremist views. Some are public, while others are private and invitation only.
One server labels itself Christian, nationalist, and “based,” slang that has come to mean not caring what other people think. Its profile picture is Pepe the Frog, a cartoon character that has been appropriated by white supremacists.
“Our race is being replaced and shunned; the media and the schools are turning people into degenerates,” reads the group’s invitation to others to join.
Jeff Haynes, a gaming expert who until recently worked at Common Sense Media, which monitors online entertainment for families, said: “Some of the tools used to connect and foster community, foster creativity, foster interaction can also be used to radicalize, to manipulate, to broadcast the same kind of coarse language and theories and tactics to other people.”
Game companies say they have cracked down on hateful content, established bans on extremist material and recorded or stored audio from in-game conversations for use in potential investigations. Some, such as Discord, Twitch, Roblox and Activision Blizzard – the maker of Call of Duty – have put in place automatic detection systems to scan for and delete prohibited content before it can be posted. In recent years, Activision has banned 500,000 accounts on Call of Duty for violating the code of conduct.
Discord said in a statement that it was “a place where everyone can find belonging, and any behavior that goes against that is against our mission.” The company said it banned users and shut down servers if they displayed hate or violent extremism.
Will Nevius, a Roblox spokesperson, said in a statement: “We recognize that extremist groups use a variety of tactics in an attempt to circumvent the rules on all platforms, and we are determined to stay one step ahead of them.”
Valve, the company that runs Steam, did not respond to a request for comment.
Experts such as Mr. Haynes say the fast-paced, real-time nature of gaming creates enormous challenges for policing illegal or inappropriate behavior. Nefarious actors have also proved adept at evading technological barriers as quickly as they can be erected.
And with some three billion people playing games around the world, the task of monitoring what is happening at all times is virtually impossible.
“In the coming years, there will be more players than there will be people available to moderate the gaming sessions,” Haynes said. “So in many ways this is literally trying to stick your fingers into a dike that’s riddled with holes like a huge amount of Swiss cheese.”