When a gunman attacked a mosque in Christchurch, New Zealand, last week, he live-streamed the attack on Facebook. The shooting, which left at least 50 people dead, is one of the deadliest in the nation's history. Facebook has admitted that, although only around 200 people watched the attack during the live stream, the video was viewed as many as 4,000 times before it was taken down.

Not Reported

According to the social media company, no one reported the shooting while it was being streamed. The first report came 29 minutes after the live stream began; by then, the damage was done. The shooter had posted a link to the live stream, along with a message about the attack, on the message board 8chan.

It took no time at all for users on the board who had been following the shooter's posts to find the video and aggressively work to preserve it. They downloaded copies and re-uploaded them to corners of the web where they would be harder to take down. Even after Facebook removed the original, it could not stop copies from continuing to circulate elsewhere.

The plan was to saturate the web with the video, uploading it as many times as possible in as many places as possible, including back onto Facebook itself. The plan worked: users attempted to upload the video to Facebook as many as 1.5 million times.

Out Of Control

Once the video went out as a live stream, it became impossible to contain how widely it would be seen. Facebook has confirmed that as many as 4,000 people saw the original post before it was taken down. That number, however, doesn't account for views of the copies spread across the rest of the internet.

The Christchurch attack saw 50 people murdered during Friday prayers at a local mosque. In its wake, the New Zealand government has worked tirelessly to support the victims' families and the local community. The country moved immediately to ban military-style semi-automatic weapons, pledged to cover the victims' funeral costs, and is working to help a community recover from a mass tragedy.

Facebook has been working to block the video of the attack from being uploaded again, but some versions still slip through the cracks. Of the roughly 1.5 million attempted uploads, only about 1.2 million were stopped at the point of upload; the remaining 300,000 or so copies made it onto the platform and had to be removed afterward. Facebook is investing more resources in moderation, but will that be enough?
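To give a rough sense of what "blocked at the point of upload" means, here is a minimal, hypothetical sketch of hash-based matching: an incoming file's digest is compared against a blocklist of known copies. Every name here is illustrative; Facebook's actual matching pipeline is not public.

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests for known copies of the video.
# (Illustrative placeholder only; not Facebook's real system.)
BLOCKED_DIGESTS = {
    "0" * 64,  # stand-in for the digest of a known-bad upload
}

def file_digest(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash a file in fixed-size chunks so large videos never sit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def should_block(path: str) -> bool:
    """Reject an upload whose bytes exactly match a known-bad copy."""
    return file_digest(path) in BLOCKED_DIGESTS

# Example: screen a hypothetical incoming upload before publishing it.
if should_block("incoming_upload.mp4"):
    print("upload rejected: matches a blocked video")
```

The catch is that an exact digest changes completely if a video is re-encoded, trimmed, or cropped, which is exactly what users did here. That helps explain why hundreds of thousands of copies evaded blocking, and why large platforms match on perceptual fingerprints of the audio and video rather than on raw bytes.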

Massive Flaws

At the very least, the Christchurch attack video proves that Facebook is struggling to contain the very thing that made it so lucrative as a company: its scale. Social media's massive presence around the world creates moderation problems to match. The double-edged sword of being a nearly omnipresent platform is that no amount of moderation will ever be enough. There are simply too many users.

Facebook, like other social media and user-driven web platforms, has faced criticism before for failing to respond quickly to posts that violate its terms of use. Videos depicting this kind of violence are prohibited on these sites, but they are hard to block the moment they appear. With humans doing much of the moderation, the margin for error is larger than it should be.