YouTube announced on Thursday that it would begin removing harmful conspiracy theory videos from its platform, including content tied to the QAnon conspiracy theory, which has been the subject of many recent policy changes across social media. The video hosting giant published a statement on Thursday outlining its intentions and the timeline for the new policy.
“Today, we are taking another step in our efforts to curb hate and harassment by removing more conspiracy theory content used to justify real-world violence,” the statement read. Starting this week, the company will begin flagging and removing content that seeks to indoctrinate users into conspiracy theories like QAnon. YouTube says it will take some time to fully implement the new ban, but many are reporting that some of the bigger accounts that published popular QAnon videos in the past are already gone.
YouTube’s expanded conspiracy theory policies follow its earlier move to exclude such content from its suggested videos algorithm. Users watching seemingly innocuous videos on the platform could be lured into conspiracy theory content through those suggestions, which served as a trail of bread crumbs for many. The company also said it would remove hateful content related to popular conspiracy theories, such as a user engaging in targeted harassment by claiming that a person is a pedophile for, say, defending a popular Netflix movie that became a target of QAnon earlier this year.
The decision represents a significant step, but it also underscores the inherent challenge social media companies face in the age of information (and misinformation). Addressing these issues requires expertise and the ability to recognize nuanced dogwhistles, especially as banned ideas adapt to evade enforcement. “Managing misinformation and harmful conspiracy theories is challenging because the content is always shifting and evolving. To address this kind of content effectively, it’s critical that our teams continually review and update our policies and systems to reflect the frequent changes,” the press release continued.
YouTube has determined that conspiracy theories like QAnon can incite real-world harm, and it cites this as a major factor in its decision to ban and remove much of the conspiracy theory content on its platform.
If you have a loved one who has fallen into the QAnon rabbit hole, read our resource guide on how to speak with them about it here.