Facebook announced on Tuesday that it will ban QAnon and the related conspiracy theories behind many disinformation campaigns. The company has been struggling to control the spread of misinformation ahead of the 2020 election, and QAnon's presence on the platform has upended much of that work in recent months.
QAnon is a sprawling conspiracy theory that emerged in the far corners of the internet in 2017, growing out of the Pizzagate conspiracy theory that spread ahead of the 2016 election. Once confined to the fringes of society, the theory has increasingly pushed its way into mainstream politics ahead of the 2020 election. It claims that Hollywood and the Democratic Party are run by a cabal of elite pedophiles who torture children and harvest their adrenaline for a drug that preserves youth and beauty. Celebrities like Chrissy Teigen and Tom Hanks have become targets of QAnon supporters, as have political figures like Bill Clinton.
Facebook began cracking down on QAnon groups and related content earlier this summer, when the conspiracy theory pushed into the mainstream by co-opting calls on social media to end human trafficking. On Instagram, hashtags related to human trafficking saw an uptick in QAnon-related content, which served as a radicalization tool for curious activists by appealing to their moral instincts and gut reactions.
Memes spreading falsified human trafficking statistics, with no sources cited, circulated like wildfire, leading millions of people to believe the problem is far worse than it may actually be. Attempts to fact-check this information are largely futile, since QAnon fosters distrust of news media by alleging that journalists, too, are part of the cabal of pedophiles.
Facebook said on Tuesday that it would expand the policies it introduced earlier in 2020 to find and remove content associated with QAnon. “Starting today, we will remove any Facebook Pages, Groups and Instagram accounts representing QAnon, even if they contain no violent content,” the company said in a press release. “This is an update from the initial policy in August that removed Pages, Groups and Instagram accounts associated with QAnon when they discussed potential violence while imposing a series of restrictions to limit the reach of other Pages, Groups and Instagram accounts associated with the movement,” the press release went on to say.
One of the biggest concerns with QAnon is not necessarily its violent threats, though the FBI has labeled the conspiracy theory a potential threat, but its ability to create and spread misinformation at scale. Facebook acknowledged as much by targeting all accounts found to be promoting the conspiracy theory, especially as people have described how it has caused not only political harm but strain in personal relationships. Accounts connected to the theory are known to post other conspiracy theories and misinformation, such as misleading statistics or claims about topics like the COVID-19 crisis, the California wildfires, mail-in voting and climate change.
“We’ve been vigilant in enforcing our policy and studying its impact on the platform but we’ve seen several issues that led to today’s update,” Facebook said. “For example, while we’ve removed QAnon content that celebrates and supports violence, we’ve seen other QAnon content tied to different forms of real world harm, including recent claims that the west coast wildfires were started by certain groups, which diverted attention of local officials from fighting the fires and protecting the public. Additionally, QAnon messaging changes very quickly and we see networks of supporters build an audience with one message and then quickly pivot to another. We aim to combat this more effectively with this update that strengthens and expands our enforcement against the conspiracy theory movement,” the press release said.
Facebook is the latest social media company to enforce stricter policies on such conspiracy theories. YouTube recently banned conspiracy theory content after first promising to remove it from the platform's video recommendation algorithms, and Twitter and TikTok have also cracked down on the conspiracy theory in recent months.