After it was revealed that 8Chan played a major role in the August 3 El Paso shooting that killed 22 people and injured 24, legislators began calling for greater oversight, and possible censorship, of what can be shared on public online forums. 8Chan, a discussion-board website that occupied one of the darker corners of the web before being shut down in the shooting's aftermath, seldom moderated its boards, arguing that moderation would threaten its users' right to freedom of speech. The shooter, however, was reportedly active on the site and even posted his manifesto to its boards before carrying out the attack.
8Chan’s Security Compromised After Attack
8Chan began as a place for people to gather and discuss topics banned on other, more mainstream platforms. Where a group of incels (involuntary celibates) might be banned from Facebook or Twitter for discussing violence against women, they could thrive on 8Chan, where no one was looking. The site quickly became a hotspot for everything from QAnon, the anti-government conspiracy theory recently identified in an FBI memo as a potential domestic terrorism threat, to violent racism, with users cheering one another on to carry out terrorist attacks and discussing mass shootings as if they were a game.
After the El Paso attack revealed 8Chan's role in fostering extremist behavior, its security provider, Cloudflare, dropped it as a client, leaving the site exposed to denial-of-service attacks and other security breaches. Cloudflare is now urging other hosting and security services not to work with controversial websites, which can pose a significant business risk: companies do not want to be associated with platforms that support, or create a pathway to, violent extremism. In filings to investors, Cloudflare disclosed that it has had issues with its clients' behavior in the past.
The loss of protection forced 8Chan offline while it searched for another host, and once word got out that the site had played a role in the planning and execution of a terrorist attack, its prospects were slim. Banning a website like 8Chan is controversial in itself, but so is letting one run free, devoid of any responsibility for public safety. Sites like Facebook, which have acknowledged their role in spreading extremism, are working to change their terms and conditions and to moderate behavior and discourse that lead to violent attacks. Legally, internet hosting services and social media platforms do not have to intervene when user conversation turns violent; ethically, they should.
The U.S. Government Gets Involved
8Chan, if anything, has become a centerpiece of the argument for greater censorship and moderation of online forums, moderation that some platforms already practice, though not universally. After the El Paso shooter published his manifesto on 8Chan before carrying out his attack, the U.S. government began investigating why the website allowed such content to be published in the first place. The House Homeland Security Committee subpoenaed 8Chan's owner, Jim Watkins, to testify before Congress about what the site has done to combat violent extremism.
A probe into how social media platforms fuel extremist behavior has been underway for some time. Websites like 8Chan only deepen the suspicion that social media and discussion sites can be breeding grounds for illegal activity, whether drug deals, organized crime, or impulsive terrorist attacks that endanger hundreds of lives.