Families of Buffalo Massacre Victims Sue Meta, Reddit, and Google Over Conspiracy Theories

The lawsuit seeks changes to the companies’ safety standards, with the plaintiffs calling the platforms “defective and unreasonably dangerous.”

Image: Spencer Platt (Getty Images)

It’s been just over a year since lone gunman Payton Gendron traveled more than 200 miles and opened fire at a grocery store in Buffalo, New York, last May. Now, the families of those wounded and killed in the attack want to hold social media companies accountable with a wrongful death lawsuit, arguing that the gunman was radicalized by racist Great Replacement conspiracy theories he encountered online.

The lawsuit was filed yesterday in the New York Supreme Court in Buffalo and names Meta, Instagram, Reddit, Google, YouTube, Snap, 4chan, Twitch, and Discord as defendants; Gendron spent months planning the shooting on Discord. Also listed are Vintage Firearms, the store that sold the shooter the gun, and RMA Armament, the online retailer the shooter used to purchase body armor. The lawsuit seeks an unspecified amount in financial damages as well as changes to the companies’ safety standards, with the plaintiffs calling the platforms “defective and unreasonably dangerous.”


“By his own admission, Gendron, a vulnerable teenager, was not racist until he became addicted to social media apps and was lured, unsuspectingly, into a psychological vortex by defective social media applications designed, marketed, and pushed out by Social Media Defendants, and fed a steady stream of racist and white supremacist propaganda and falsehoods by some of those same Defendants’ products,” the lawsuit reads.

Social media and the internet were key factors both before and after the attack, which left 10 Black people dead and three others injured. Gendron was sentenced to life in prison in February. In addition to being radicalized on social media, as the lawsuit argues, Gendron researched the predominantly Black neighborhood online as a target before the attack. After the shooting, videos and images of the gruesome scene proliferated online, with the mother of survivor Zaire Goodman claiming that she was “tagged” in a video of the massacre, according to the Associated Press. Gendron also livestreamed the shooting on Twitch as it was occurring.


“We deliberately designed Snapchat differently than traditional social media platforms and don’t allow unvetted content to go viral or be algorithmically promoted. Instead, we vet all content before it can reach a large audience, which helps protect against the discovery of potentially harmful or dangerous content,” Snap spokesperson Rachel Racusen told Gizmodo in an email.

A YouTube spokesperson told Gizmodo: “Through the years, YouTube has invested in technology, teams, and policies to identify and remove extremist content. We regularly work with law enforcement, other platforms, and civil society to share intelligence and best practices.”


Social media is a hotbed for conspiracy theories, a reality that was exacerbated by the covid-19 pandemic. In 2019, a researcher employed by Facebook found that accounts aligning with conservative views could be fed conspiracy content in as little as two days, with Facebook’s algorithm pushing users down endless rabbit holes based on their interests and political leanings. Reddit CEO Steve Huffman told Wired in April 2023 that Reddit has cracked down on conspiracy communities on the platform (Reddit was known for its racist, homophobic, and misogynist leanings in its heyday), but that he also believes regulating social media is a bit tyrannical.

Update July 13, 12:20 p.m. EST: This article was updated to include comments from Snap and YouTube spokespeople.