Discord’s Growing Moderation Crisis
As the chat platform welcomes more users, it faces new challenges.
On Saturday, 18-year-old Payton Gendron entered a grocery store in Buffalo, N.Y., and allegedly opened fire with an assault rifle. Ten people were killed and three were injured; 11 of the 13 victims were Black.
Directly inspired by the Christchurch mosque shooter, who broadcast his 2019 attack on Facebook and posted about it beforehand on 8chan, Gendron reportedly took similar actions: livestreaming the shooting on Twitch and planning it in the open on Discord over five months.
Beginning in December, Gendron sent hundreds of messages in his private Discord server espousing hateful rhetoric and outlining his plans for the attack, according to a Bloomberg report.
Discord has leveled up since it launched in 2015. It is no longer just a home for gamers, and it has not been for some time. It now more closely mirrors forum boards such as Reddit, 4chan and 8chan, but with one catch: its servers are considered private and therefore go largely unmoderated by the company. Absent user-generated reports, its safety team does not actively search for offending content.
When a user spins up a Discord server on any topic, it is not considered publicly accessible unless the owner opts in to listing it in Discord's server directory. That makes those servers essentially private message rooms: they lack the end-to-end encryption that Telegram and Signal provide, but they enjoy similar protections from moderation.
“These are some of the most innovative, profitable companies in the history of capitalism,” Anti-Defamation League CEO Jonathan Greenblatt said Monday on CNN, speaking about the responsibility online platforms bear in cases like this.
“You can see how they’ve tackled copyright infringement,” Greenblatt said. “You can see how they’ve tackled other issues. If they applied some of their energy and some of their innovation to this area, this issue could be resolved much more effectively today. They may also have to look at some of their policies.”
Violent extremism and other illegal uses of Discord aren't new. In 2019, Forbes published a search warrant the FBI obtained to access a Discord server run by a cybercrime group called Hells Gate. In July 2021, a North Carolina court sentenced 26-year-old Mayuresh Suresh Iyer to 14 years in prison for distributing child pornography on Discord.
Since the shooting on Saturday, Discord has not issued a statement via any of its social media accounts. Instead, it has continued to celebrate its seven-year company anniversary.
The company did speak to Bloomberg for its report, saying that it is cooperating with law enforcement and that it removed Gendron's private server once it became aware of his messages. Discord did not respond to a Tuesday request for comment from The Jacob Wolf Report about its content moderation policies.
Like most social media platforms, Discord is covered by Section 230 of the Communications Decency Act of 1996, which shields sites and apps from liability as publishers of the content on their platforms, in the way a newspaper can be sued if it prints something defamatory. By enacting Section 230, Congress hoped to expand freedom of speech online. In practice, it has allowed those platforms to forgo diligent moderation of illegal activity while bearing no legal responsibility for real-life tragedies.
Where consequences have reached platforms is through their business partners. In January 2021, Amazon Web Services revoked its contract with Parler in the wake of the Jan. 6 attack on the U.S. Capitol. Apple also banned Parler from its App Store for four months, and the app remains unavailable on Android after Google banned it, too.
Discord is a bigger player than Parler, serving more than 150 million monthly active users as of December and holding a valuation of more than $15 billion. It won’t face any consequences for allowing Gendron’s messages to fester for so long.
Villanova University law professor Brett Frischmann—whose work focuses on Internet law—says he hopes platforms will start enacting what he calls “friction by design” as a means to deter the virality of mass shootings and other illegal activity.
“You might think about what are the kinds of friction that ought to be required before things can go viral, before things get shared,” Frischmann said in a Tuesday interview with The Jacob Wolf Report. “That kind of friction could be time delays. Instantaneous footage, maybe livestreaming itself. There should be some kind of friction before it can reach a public audience.”
Gendron's livestream on Twitch lasted approximately 25 minutes, but roughly 23 minutes of that broadcast showed him driving in his car. Within two minutes of the alleged attack beginning, Twitch's moderation caught it and ended the stream. His five months of messages and content on Discord were not removed until after he was identified as a suspect.
Discord has ramped up its moderation policies over the past few years, beginning after the white supremacist rally in Charlottesville, Va., in 2017. That included adding an off-platform behavior clause to its community guidelines, which allows the company to ban users for hateful, violent or criminal behavior that occurs elsewhere.
On Jan. 8, 2021, Discord removed a pro-Donald Trump spin-off server of the banned r/DonaldTrump subreddit, citing its ties to the Jan. 6 Capitol insurrection, though the company said it did not have evidence that any of the violent extremists at the event organized on the platform.
In March, it also published a transparency report, which said Discord had removed 1,687,082 accounts and 52,177 servers between July and December 2021 for policy violations other than spam.
Discord's main problem, though, is that it relies on human moderation. Because of its user-privacy stance, no algorithms scan the content in users' servers the way automated systems monitor Facebook, Twitter, YouTube and Twitch, each of which has received its fair share of criticism for failing to police hateful and violent content.
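To illustrate the gap, here is a minimal, hypothetical sketch of the kind of keyword screening an automated pipeline can run on every new message before any human looks at it. The term list and function names below are invented for this example; real systems at the major platforms layer trained classifiers, media hashing and human review queues on top of anything this simple, and nothing here reflects Discord's actual tooling.

```python
# Hypothetical sketch of automated keyword screening, the kind of proactive
# scan Discord does not run inside private servers. All names and terms are
# placeholders invented for illustration.
import re

# Placeholder blocklist; a production system would use trained classifiers,
# not a static word list.
FLAGGED_TERMS = {"placeholderterm", "anotherbadterm"}

def normalize(text: str) -> str:
    """Lowercase and strip non-letter characters to defeat simple evasions
    like spacing or punctuation inserted inside a flagged word."""
    return re.sub(r"[^a-z]", "", text.lower())

def needs_review(message: str) -> bool:
    """Return True if the message should be queued for human review."""
    collapsed = normalize(message)
    return any(term in collapsed for term in FLAGGED_TERMS)

if __name__ == "__main__":
    samples = ["a harmless message", "a p-l-a-c-e-h-o-l-d-e-r-t-e-r-m evasion"]
    for msg in samples:
        print(f"{msg!r} -> {'review' if needs_review(msg) else 'ok'}")
```

Even a toy filter like this makes the tradeoff plain: proactive scanning requires reading message content, which cuts directly against the user-privacy stance Discord has staked out for private servers.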