Just days after Twitter announced it has been considering a ban on white supremacy across its platform, YouTube has gone a step further, announcing that it is banning supremacist and extremist content from its platform entirely. In a post on its official blog on Wednesday, the video-sharing website said that, in its ongoing battle against hateful content, removing and banning supremacist material was the logical next step. Implementing the ban, however, may be more complicated than it sounds. It is a tricky situation, but enforcing the ban could help ensure a safer environment for YouTube's user community.
YouTube's Ongoing Battle Against Extremism
Like other social networking platforms, YouTube provides a place where hateful, extremist content and fake news can thrive. User-generated content is subject to few guidelines, since users are protected by free speech and by the philosophy that YouTube is a place where creators can share their work without judgment. That openness comes at a price: conspiracy videos alleging that tragic terrorist events never took place can be shared en masse, white supremacist and other racist manifestos are given a platform, and voices that blur the line between free speech and hate are given a place to flourish.
For this reason, it is paramount that YouTube take a clear stance on this content. By giving it a platform, the company fuels a dangerous fire that can lead to real-world violence. In the case of Myanmar's Rohingya genocide, for example, Facebook has had to acknowledge that it played a major role in the spread of hateful propaganda that led to real human deaths.
In its first big step toward ridding the platform of hateful content, YouTube adjusted its algorithm to keep conspiracy theory videos out of its recommendations. While the company did not remove the content outright, it stripped those videos of their ability to be surfaced by the recommendation system. This marked a major step toward removing the content altogether, as it took away the incentive to create such videos for the sake of ranking well in YouTube's recommendations. The company, which is owned by Google, had plans to push its rules further still.
"Today, we're taking another step in our hate speech policy by specifically prohibiting videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status," YouTube wrote in the official statement posted to the company's blog on Wednesday. "This would include, for example, videos that promote or glorify Nazi ideology, which is inherently discriminatory. Finally, we will remove content denying that well-documented violent events, like the Holocaust or the shooting at Sandy Hook Elementary, took place," the company continued.
But with censorship comes the argument that even objectionable speech can have a legitimate use. Alongside its announcement, YouTube clarified that the banned content may, in the future, be made available to researchers trying to better understand extremism. Again, looking at Myanmar, combating hate and breaking down systems of oppression requires understanding their roots in the first place. "We recognize some of this content has value to researchers and NGOs looking to understand hate in order to combat it, and we are exploring options to make it available to them in the future," YouTube wrote. While hateful and extremist content may serve a purpose in research, YouTube clearly feels it has no place in the public sphere.