Facebook announced changes to its 2020 Presidential Election strategy on Wednesday, and the press release is a treasure trove of information about how the social media giant anticipates Election Day will go. The company has struggled with the spread of disinformation and election interference in the past, and this update changes how the platform will interact with democracy going forward, so let's take a look at what it says:
Facebook has two clearly defined goals ahead of the U.S. Election.
In the last couple of months we've seen sweeping changes to Facebook's policy on misinformation and conspiracy theories as QAnon rose to the mainstream, in part because Facebook had no clearly defined policy on coordinated inauthentic behavior until now. Facebook announced a ban on QAnon earlier this week, and has since followed through on its promise to find and remove as much of the content as it can from the platform (though some report that dozens of accounts and groups dedicated to the conspiracy theory are still there).
As part of its election strategy, the company has made it a priority to find and remove content that contributes to a wider scheme to interfere with the election. QAnon, a conspiracy theory that claims members of the Democratic Party are part of an elite cabal of pedophiles, is just one of the targets of Facebook's promised fight against misinformation.
“Our Dangerous Organizations Operations team will continue to enforce this policy and proactively detect content for removal instead of relying on user reports. These are specialists who study and respond to new evolutions in violating content from this movement and their internal detection has provided better leads in identifying new evolutions in violating content than sifting through user reports,” the company said in a press release on October 6th. The approach represents a significant change from how Facebook has handled harmful content in the past: rather than waiting on user reports, the company will proactively detect and remove it. This is a step in the right direction, but it will take time to prove itself effective.
The company also announced new bans on content that seeks to misinform, intimidate or encourage voters to interfere with the vote. After President Donald Trump recently encouraged voters to illegally watch polling places, the company is taking steps to clear up misinformation about what poll watching is and whether it's legal to show up to a polling place armed (in most places, it's not).
“We thank the civil rights experts and community members who continue to help us understand trends in this area and we look forward to continuing to work with them,” the company said in the press release.
Facebook's other key goal in its election strategy is to encourage users in the United States to register to vote. The company has already been encouraging users to vote for some time with a banner at the top of the News Feed within the app (it's even become a sort of meme on social media recently). The banner encourages users to check their registration status and gives them resources on how to register in their current state of residence if they are not registered already. Facebook estimated in September that it has helped as many as 2.5 million people register to vote in 2020.
Updates to its Elections Operations Center
Facebook users were first introduced to the Elections Operations Center in 2018 during the U.S. midterm elections, and since then the company has rolled out the feature in other countries, such as Brazil and throughout Europe, to combat misinformation associated with elections around the globe. Users saw the feature again at the start of the COVID-19 pandemic, when Facebook repurposed the Elections Operations Center to serve reliable information about the virus after it saw an uptick in misinformation on its platform. More recently, users saw the same feature in Facebook's new approach to climate change, again to combat an uptick in conspiracy theories, climate change denial posts and misinformation on the topic.
The Elections Operations Center, like the COVID-19 and Climate Change Information Centers, will be found at the top of the News Feed within the Facebook app. The center will offer fact-checked, reliable information about how to vote, whether voting by mail is safe (it is), and what voters can expect on Election Day. The company did not specify which dates the Elections Operations Center will be live.
One of the most jarring revelations in the press release on the election strategy was the announcement that Facebook has put safeguards in place to combat misinformation in case one candidate declares themselves the winner prematurely. The Elections Operations Center will also serve as a live ballot counter, offering accurate information in real time as a way to contest false declarations.
Facebook will ban all political ads … after November 3rd.
Perhaps the biggest update to Facebook's election strategy is the revelation that the company will ban all political advertising for the foreseeable future. “In addition, while ads are an important way to express voice, we plan to temporarily stop running all social issue, electoral or political ads in the US after the polls close on November 3, to reduce opportunities for confusion or abuse,” said the company in the press release. The company says the ban is temporary but offers no insight into when it will be lifted, suggesting that political advertising will stay banned until, or unless, the company comes up with a safer strategy. “We will notify advertisers when this policy is lifted,” Facebook said.
This represents a stark contrast to Facebook's previous approach to political advertising. Mark Zuckerberg, the company's founder and CEO, famously said that Facebook would not only run political advertisements ahead of the 2020 Election, but that it would not fact-check the ads either. The company received widespread backlash for this, and in response Twitter CEO Jack Dorsey announced that Twitter would ban political advertising altogether. That strategy has proven effective, though questions remain about whether political ads were Twitter's problem in the first place, as the company has struggled in the past with bot accounts posting misinformation.
The announcement that Facebook will indefinitely ban all political advertising as part of its election strategy came on the same night that an ad premiered during the Vice Presidential debate. The ad targeted Mark Zuckerberg and asked Facebook users to question the ethics of Facebook's role in politics around the globe. A group called The Real Facebook Oversight Board had listed a ban on all political advertising among its demands, which Facebook has now met, but not before the company allegedly had the group's website forced offline.