The New Zealand Live Stream
When a gunman opened fire on a Christchurch mosque back in March, many were able to see the attack via a Facebook live stream that went unnoticed and unreported on the website for hours. Facebook admits that the video may have been viewed around 4,000 times before it was removed, though fewer than 200 people watched it while it was still live. In that window, no one reported the video, leaving it up for the world to see and reproduce. Once the video was on the site, users could download it or take a screen grab of the footage, and from there it could be re-uploaded across different parts of the web.
Discussion of these changes goes back to the massacre itself, which killed roughly 50 people in Christchurch, New Zealand. An open letter from Facebook COO Sheryl Sandberg was published in the New Zealand Herald shortly after the attacks. The letter apologizes for the video's presence on the web, takes responsibility for how long it stayed up, and outlines the changes the company promised to make to prevent similar situations in the future.
It’s not that Facebook is watching; it’s that Facebook users are. The company will stand by its reporting policy, which gives users the freedom to flag content that doesn’t sit well with them. Facebook Live, however, will be much stricter than it used to be. Users will now operate under a “one strike” policy against spreading hateful or violent content on the platform. For example, if a user pastes a link to a known terrorist organization’s website in their live stream, or is reported for spreading hateful messaging, their Live privileges will be suspended for a set period of time. Facebook believes this new policy will help keep such videos off the website.