After Parler went offline last month, the social media company promised it would be back soon—though few held their breath as far-right activists, influencers and QAnon believers rushed to apps like Telegram to discuss political happenings. Parler had failed to implement a timely and robust moderation policy in the days following the January 6th attack on the US Capitol. This week, the company finally revealed that it's back online, with a new CEO and a website that doesn't really work. I would know, because I tried to log in and nothing happened.
The Twitter-like social network favored by right-wing conservatives advertises itself as a place to “Speak freely and express yourself openly, without fear of being ‘deplatformed’ for your views,” according to the slogan on its home page. The new website boasts some significant changes from the old version that was available as an app, including a new logo and a new CEO, Mark Meckler.
Meckler, aside from being the interim CEO of a now-infamous social network, was a co-founder of the Tea Party Patriots—a right-wing political group that, notably, helped organize the March To Save America rally that later turned into the January 6th attack on the US Capitol. The group is also known for staunchly opposing Obamacare, and it famously supports the Trump-backed COVID-19 treatment hydroxychloroquine.
Despite its connection to some of the biggest players in the pro-Trump world, chatter about the return of Parler has been pretty lackluster on other popular right-wing internet gathering places. On Telegram, talk of moving back to Parler was largely absent this week, and criticism of the app's low quality dominated the Parler discourse in the weeks following its removal from app stores and its deplatforming by Amazon Web Services, its former host.
Even if Parler manages to clear this hurdle, it still has some significant challenges ahead. Notably, the company will have to make a decision about its moderation policy if it wants to get back into the app stores. “Parler will need a strong content curation policy in order to be accepted as a safe and legitimate website not a purveyor of disinformation and misinformation,” says David Reischer, attorney and CEO of legaladvice.com.
There is no law that requires a social media company to moderate content, but Section 230 grants companies legal protection to dictate what is and isn't allowed on their platforms. An internet company can't be held liable for the content that users publish on its forum, so long as the content doesn't violate federal law—immunity does not extend to material such as child pornography or to claims like copyright infringement.
In recent years, however, there has been growing discussion of the role that social media companies play in political manipulation. The 2016 Cambridge Analytica scandal upended the modern discourse on Section 230, prompting most social media platforms to take a stricter stance on the spread of misinformation and politically manipulative content. Amid the COVID-19 crisis and the 2020 Presidential Election in the United States, criticism of companies like Facebook and Twitter only mounted further.
Parler argues on its website that biased content-curation policies (which is another way of saying moderation) “enable rage mobs and bullies to influence Community Guidelines,” and says that its own policies are a nonpartisan approach to content moderation. In a direct criticism of other tech companies like Twitter and Facebook, Parler is turning its stance on content moderation—or lack thereof—into a form of First Amendment activism that is, once again, misleading to its audience.
“The right to express oneself is correlated to the freedom of speech offered by the First Amendment,” Reischer says. “That right allows an individual to express themselves without fear of censorship or moderation. This right does not extend to the right of expression on private online technology platforms such as Facebook and Twitter. The First Amendment only protects an individual from freedom of expression being limited by the government.”
Parler isn’t required to moderate content that aims to manipulate the masses politically, but with increased scrutiny of social media in recent years over this very topic, the platform will likely become a hotbed for political discourse on changing Section 230 (a move that has received bipartisan support in recent months, albeit for different reasons). That is, if it even manages to take off.