Live streaming has become the internet’s Pandora’s box in its unspoken struggle with terrorist activity. Twitch, the Amazon-owned platform that lets gamers broadcast their gameplay live, is at the heart of the latest controversy after the attacker behind an attempted shooting at a German synagogue allegedly live streamed his attack before it was stopped.
The gunman, who was stopped before he could enter the synagogue in Halle, Germany, on October 9, killed a man and a woman in the town square just outside the building. Though the suspect has been arrested, a 35-minute video of the attack was allegedly live streamed on Twitch, in which the gunman expressed the anti-Semitic views that likely motivated the attack, carried out on the Jewish holiday of Yom Kippur.
Twitch responded to the live streamed attack on Twitter, publicly condemning the use of its platform to incite violence and terror. “We are shocked and saddened by the tragedy that took place in Germany today, and our deepest condolences go out to all those affected,” the company said in a series of tweets posted on October 9. “Twitch has a zero-tolerance policy against hateful conduct, and any act of violence is taken extremely seriously. We worked with urgency to remove this content and will permanently suspend any accounts found to be posting or reposting content of this abhorrent act.”
Violence On Social Media
This type of attack is becoming increasingly common: the Christchurch, New Zealand mosque shooter also live streamed his attack, on Facebook. That video was viewed a couple of hundred times before it was flagged and removed, prompting questions about whether live streaming was a safe feature to offer users of the social media platform. Because users can begin live streaming freely, it, much like every other form of user-generated content on Facebook, depends on a degree of self-policing: viewers must flag a stream before it attracts the attention of Facebook’s moderators.
Over the summer, another website, 8chan, lost its hosting service after the El Paso shooter was revealed to have planned his attack on the moderator-free forum. While the company may not have condoned such behavior, its lack of moderation allowed users to freely discuss and plan ideas that are banned on other platforms. For this reason, 8chan became a breeding ground for illegal activity and white supremacy. The site is now offline, unable to find a hosting provider willing to let it continue operating with the same lack of oversight.
Combating This Behavior Has Proven Difficult
Controversies like those surrounding 8chan, along with the live streamed content appearing on larger platforms such as Facebook, have sparked questions about whether social media is safe, and about how platforms like Facebook, Twitter, YouTube, and Twitch could stop false, misleading, or otherwise dangerous information before it spreads. Because anyone can post anything at will, with no filtering system to catch content that breaches terms of service before it goes live, the question of how to combat the problem has grown more urgent in recent months.
Facebook, which has been at the center of much of the controversy surrounding social media in recent years, has dedicated entire teams to combating fake news, disinformation, and illegal activity on its site. Others, like Twitch, will have to follow suit.