Facebook Announces New Oversight Board To Tackle Controversial Content On Its Platform

Published on September 21, 2019

As Facebook readies itself for another U.S. presidential election season, it is preparing the network for a new era of social media, one that treats public safety and the spread of accurate information as top priorities. Coming off a nearly two-year wave of bad publicity, Facebook and its CEO Mark Zuckerberg are launching a new system intended to help the social networking company curb problems like disinformation, hate speech, and terrorism. This week the company announced a new Oversight Board that will act as an independent moderator for content on Facebook’s network. Facebook hopes that a third-party oversight board will resolve disputes over content governance while maintaining an environment where users are free to express themselves and communicate openly with one another.

CEO Mark Zuckerberg has been hinting at an Oversight Board for quite some time. In 2018, Zuckerberg took to the website’s newsroom to publish an open letter on the need for content governance in an age when social networks have become among the most powerful communication tools in existence. “An important question we face is how to balance the ideal of giving everyone a voice with the realities of keeping people safe and bringing people together,” Zuckerberg wrote in the letter. “What should be the limits to what people can express?” While speech in America is protected by the First Amendment, many of Facebook’s users communicate beyond the boundaries of any single state. Where the limits of free speech should fall online is a grey area, one that no single government has yet managed to define or regulate.

Facebook, much like any other social networking site, has its own set of community standards, but without a team of moderators constantly policing every last comment thread and status update, public safety is left largely to the users themselves. In the case of the New Zealand mosque massacre, that meant a live-streamed video of the attack went unreported long enough to be seen hundreds of times by Facebook users. By the time moderators removed it, the video had already been downloaded and redistributed to other corners of the web, where it was viewed millions of times over the course of just a few days. Even today, it is impossible to know whether copies of the video are still circulating in pockets of the dark web. Moderation, moreover, creates problems of its own: Facebook has had to contend with the rapidly declining mental health of its human moderators.

Facebook’s Oversight Board Aims to Advocate for Facebook Users While Working to Protect Them in an Era Where Content Governance Is a Necessity

An independent Oversight Board is Facebook’s answer to a seemingly endless conundrum: how deeply should a tech company be allowed to police online communities? The company took inspiration from Wall Street, where the structure of a public company requires an independent board that represents its investors. A public company’s board of directors works with the company, while advocating for its investors, to hold the company accountable for its spending and strategy without taking a paycheck directly from the company itself. The arrangement keeps Facebook from having to define the limits of free speech on its platform by itself, and opens the discussion to a group of appointed experts on the subject.

The Oversight Board will likely not be sitting behind a computer each day combing through every one of your Facebook posts, though. The board’s function is not to police Facebook users but to discuss and implement new community guidelines that prevent the spread of harmful information, from disinformation campaigns that aim to corrupt elections to extremist hate speech that fuels terrorist activity around the world. Human judgment on sensitive topics like these is integral to the program’s success, as technology has proven inferior at content moderation. Tumblr’s attempt to automatically filter adult content on its platform, and the apparent racial bias of content-moderation algorithms on other platforms, show that AI isn’t quite there yet when it comes to questions of censorship, an issue that not even Plato could solve.

The company announced the Oversight Board in an open letter published to its newsroom, highlighting the need for the group as well as how it plans to nominate members. Each of the board’s 30 to 40 members will serve a term of no longer than five years, giving every member a chance to take part in content oversight without holding power for too long. If the board decides to change Facebook’s content guidelines, the company must implement the changes in a timely manner and cannot amend them without the board’s approval. Essentially, social networking has created its own internal government.

Julia Sachs is a former Managing Editor at Grit Daily. She covers technology, social media, and disinformation. She is based in Utah, and before the pandemic she liked to travel.
