May 24, 2022
Payton Gendron, a white 18-year-old wearing military gear and livestreaming on the video platform Twitch, opened fire with an AR-15-style rifle at a supermarket in Buffalo on Saturday, May 14, killing 10 people and wounding three others in what authorities are describing as "racially motivated violent extremism."
Gendron, who drove to the predominantly Black neighborhood from his hometown in another part of New York, more than 200 miles away, shot 13 people in the parking lot and inside a Tops Friendly Market. He fatally shot several people before aiming at a white man hiding near a checkout line. Gendron said "Sorry!" to the man before backing away from him.
About half an hour before Gendron's livestream began, several users on the chat-based social media platform Discord received invitations to view an online diary allegedly authored by the gunman. In it, he detailed a months-long plan to livestream a violent attack on Black people.
The Buffalo mass shooting is the focus of a probe being conducted by New Jersey Attorney General Matthew Platkin, who released details of the upcoming investigation into both Twitch and Discord, which will determine whether the companies are violating consumer protection laws by failing to enforce their policies against racist content and violent extremism.
"These social media platforms have enormous reach, especially with young people, and have shown themselves to be staging grounds for hateful and extremist content that may radicalize children and others," said Platkin. "New Jersey has a substantial interest in investigating how these companies moderate and prohibit content that may harm consumers. Under New Jersey law companies must deliver on their promises, and the persistence of violent extremism and hateful conduct on these platforms casts doubt on their purported content moderation and enforcement policies and practices."
In particular, the probe seeks to determine how content moderation on these sites is applied to minors under 13 who frequent the platforms. The state wants to determine whether Discord and Twitch's moderation policies leave openings for children and young adults to be drawn into radicalized, extremist belief systems.
Twitch said it removed the gunman's livestream within two minutes of the violence beginning. However, recordings of the video were preserved and shared millions of times in the days following the attack, largely because of how easy it is to rehost video content across multiple platforms.
Twitch rose to popularity as a video game livestreaming platform, but it has been used to broadcast violent extremism before. In 2021, a series of "hate raids" on Twitch streams targeting Black and LGBTQ+ streamers caused an uproar among content creators who frequently use the platform.
Hate raids, in which some Black and LGBTQ+ Twitch streams were inundated with hateful comments from bot accounts, became the basis for a one-day boycott by some major content creators. Black streamers urged Twitch to do something to protect them. In response, the company added chat filters and gave streamers the option to require viewers to have a verified phone number before commenting.
Twitch streamers told Allure in 2021 that, rather than adding new content moderation policies, they believed the company was repackaging existing policies that weren't doing enough to combat hatred on the platform.
Twitch has previously come under fire from Black creators, many of whom report consistent racist and abusive attacks from users. Twitch released a video in 2020 following that summer's uprisings against systemic racism, but the one-minute clip mostly highlighted white creators and included only one line spoken by a Black streamer.
Discord said it removed Gendron's online diary as soon as it was discovered, and that 15 users had clicked the invitation to join his private server.
Though Discord began as a platform for gamers to chat with one another while playing, it became a hub for white supremacists, who used it to organize the 2017 "Unite the Right" rally in Charlottesville, Virginia. The gathering left 34 people injured and one woman dead after she was deliberately struck by a car.
Following the attacks in Charlottesville, Discord stepped up its content moderation in an effort to expand its business and attract more users. After George Floyd was murdered in May 2020 by a Minneapolis police officer who has since been convicted, Discord detailed ways the platform could work to combat hate and ensure it was not used to spread violent extremism.
In 2021, it was discovered that Geoff Frazier, a former employee of the video game company Blizzard, had spent months posting bigoted and hateful messages targeting women, disabled people and the LGBTQ+ community on a Discord server called "The Right Wing of Gaming," including complaints that workers at Blizzard, particularly women, were "ruining the company."
A 180-page manifesto allegedly authored by the Buffalo gunman was circulated on 4chan and 8chan, two online forums with lax content moderation. Both sites have become hubs of racist, misogynistic and hateful rhetoric, and have previously hosted violent proclamations from other mass shooters.
The use of forums like 4chan and 8chan to spread harmful conspiracy theories and extremist ideologies is not new. A manifesto posted ahead of the 2019 mosque shootings in Christchurch, New Zealand, has become a template for similar documents spread on social media in advance of mass violence. A gunman who killed 23 people in El Paso, Texas, also posted a document describing his radicalized belief system ahead of that shooting.
Since these social platforms are easy to access and do not have strict content moderation policies, they are often some of the first places that radicalized far-right Gen-Z teens and young adults go to find people who share their ideologies. Though many of Discord's servers are invitation-only, others are accessible through a searchable database, allowing people to connect with users around the world and build virtual camaraderie.
The Institute for Strategic Dialogue — an anti-extremism think tank — sought to discern how the far-right has become entrenched on the outskirts of social media platforms like Twitch and Discord. In its research, the ISD found that many of the far-right spaces on these platforms are publicly available.
The ISD also discovered that white supremacists would often livestream "Omegle Redpilling" on Twitch, where they would dress up in military gear or as characters like "Racist Super Mario" and search video chat forums for people to racially abuse. The streams would incorporate gory images or hateful messages about "out" groups, like ethnic minorities or the LGBTQ+ community, WIRED reported.
Gendron explicitly mentioned three Hasidic Jewish communities in New Jersey in his manifesto. The growing Jewish communities of Lakewood, Toms River and Jersey City were specifically discussed in the gunman's writings on antisemitism. Those passages were part of a larger document that explained the alleged racial motivations for the massacre.
The document, which was widely circulated in the days following the attack, details how he turned to 4chan in March 2020 and eventually became a daily user.
Much of the document consists of racist memes and diatribes taken directly from 4chan, and it paints a picture of the radicalization that led Gendron to become a proponent of the "Great Replacement" theory. That conspiracy theory holds that demographic shifts are deliberately engineered to wipe out the white population and "replace" it with ethnic minorities and immigrants.
Prior to the New Jersey Attorney General's Office announcing the probe into Twitch and Discord, New York Attorney General Letitia James announced the state's own investigation into the "gamification" of the social media platforms and how they have contributed to an influx of violent extremism.