The Domino Effect: How Social Media Staged A Coup

Published on January 7, 2021

The violent insurrection and attempted coup that took place at the U.S. Capitol building in Washington, D.C. on January 6th was the culmination of months of organization on social media platforms such as Twitter, Reddit, YouTube, Gab and Parler. Over the last year, millions of right-wing social media users have been quickly indoctrinated into online extremist movements—largely thanks to vague terms and conditions put in place by the social media companies that play host to their communication.

In March, when millions of Americans suddenly found themselves at home with more time to spend online, social media platforms saw significant growth from users that were once less active or didn’t have accounts at all. Apps like TikTok went from being niche playgrounds for Generation Z to becoming household names (so much so that TikTok was nearly banned in the United States in 2020 after activists used it throughout the summer). Between the first and second quarters of 2020, Twitter saw an increase of 20 million daily active users—people who access the site and interact with it on a daily basis. Facebook, meanwhile, saw a bump of about 77 million daily active users during that same time, compared to steady growth of around 36 million per quarter in the years prior.

Suddenly in need of a new way to pass the time, users that once filled their days with blue-collar work or small business ownership were now at home, left with little else to do but spend more time on social media. At the same time Facebook, still mending its public image after the Cambridge Analytica scandal incited widespread user distrust, was in the midst of a campaign to encourage users to spend their time in Facebook groups. The campaign, which emphasized the sense of community that Facebook users could find in groups, encouraged them to interact with like-minded individuals who shared their interests in anything from bizarre internet comedy to conspiracy theories. The latter, especially, took off.

While the world navigated its way through the throes of a global pandemic, often seeing health officials retract or change guidelines in real time, many began to question whether the pandemic was real at all. Conspiracy theorists on social media, claiming to be experts, blasted health officials’ guidance that masks could properly protect against the virus. Many spread false allegations that COVID-19 vaccines had ties to satanic cults to incite fear in conservative Christians, and others insisted that President Trump was using the virus to cover up a secret operation to return young human trafficking victims to their loved ones.

Citing pictures allegedly taken from inside makeshift COVID-19 hospitals in places like New York City and Los Angeles at the time, users argued that the hospitals were set up for psychological trauma healing, not COVID-19. Entire Facebook groups dedicated to protesting mask ordinances around the United States used language that echoed what was coming from the Trump campaign. Posts and viral memes used words like “patriots” and “freedom” to describe those that defied mask mandates, arguing that requiring them to wear a mask in public was a breach of their individual liberties as American citizens. Since many of these people were veterans, or families of veterans, the sentiments echoed by these groups gave them a heightened sense of identity and confirmed their existing beliefs—it helped them feel united and empowered by a greater cause.

By May, when an entirely new crisis would begin to unfold, users that had felt united by their fears and beliefs would only feel further disenfranchised by sudden calls for change that seemed radical to them—like defunding police departments around the United States or integrating critical race theory into public education. The Trump campaign, aware that his supporters would look to him for guidance as protests sparked around the United States, saw an opportunity to reinforce the language of fear that had begun to spread among social media users in suburban and rural America. Rather than calling to unite America during a time of crisis, Trump instead sought to use the widespread civil unrest as a way to speak only to his supporters.

The President, armed with a Bible and a militia, leaned further into rhetoric that sought to divide America. He began integrating openly racist policies into his promises for a new term, telling his suburban supporters—specifically suburban white women—that he would protect them from the low-income housing that Democrats sought to build in their communities as a way to help undo the racial segregation caused by redlining in the mid-20th century.

At the same time, the Trump Administration sought to categorize the decentralized anti-fascism movement as a terrorist organization, further cementing the narrative that was being used by right-wing media to describe a now global civil rights movement. The movement intended to draw attention to police violence and institutionalized racism, but those beliefs directly contradicted the ones that Trump built his power on. While Trump promised to rid suburbs of low-income and accessible housing opportunities, he also drew attention to left-leaning activists, branding them with frightening buzzwords like “Antifa thugs” and “Marxists” on social media, where his impassioned following was only growing more angry and more threatened by the day.

Falsified pro-Antifa flyers have been circulating social media off and on in recent years.

During this time, social media was flooded with activism. On Twitter, the divide between right and left only stretched further while both movements grew fairly quietly. With concerts, theme parks and events shut down for the summer, many of the same people that had suddenly started using social media in March maintained their newfound hobbies. Boiling unrest in major U.S. cities sparked discourse on Twitter. But aside from the few stragglers that were able to trickle between the two social groups, both carried on fairly unaware of what the other side was up to, and the echo chambers only grew.

On Instagram, however, users were being bullied by their peers for posting content that didn’t advocate for the movement for racial justice. Influencers that opposed Black Lives Matter were met with criticism from followers, and those that did not speak up about the movement were left to be forgotten during a time when social media users were more interested in advocating for a cause than in buying a product. Twitter, which had always been fairly divided politically, didn’t see new division quite like Instagram, where users had previously been able to easily avoid discussing politics if they didn’t want to. People were there for the photos, and up until this time even Instagram Stories was seldom used for anything but making life look pretty good to your followers. Now that everyone—especially communities of young people on Instagram—had nothing to do but look at the systemic problems that America had refused to face for so long, right-wing social media users suddenly found themselves alienated from the mainstream.

That is, until Ghislaine Maxwell was arrested in July over sex trafficking allegations in relation to the disgraced financier Jeffrey Epstein. The arrest—which, admittedly, sparked intrigue in just about everyone—was the second catalyst that introduced fringe conspiracy theories like QAnon and #Pizzagate to the masses through social media in 2020. Just days after Maxwell was arrested in New Hampshire, a new conspiracy theory claiming that the e-commerce website Wayfair had been engaging in child sex trafficking went viral on social media. On Twitter and TikTok, videos and posts strung together various clues into sweeping allegations that the site was being used to sell children into sexual exploitation.

Within days, the now-debunked conspiracy theory had attracted notable figures from within the anti-trafficking industry. Tim Ballard, the controversial founder of Operation Underground Railroad, used the Wayfair conspiracy theory to spread awareness of human trafficking and sexual exploitation. In a YouTube video, Ballard discusses the ways in which children are commonly trafficked, leaning into the Wayfair conspiracy theory to confirm the biases of—and prey on—passionate believers who were suddenly facing a moral panic.

“Children are sold that way,” he says in the video before discussing his history as an undercover operative. Ballard once worked alongside Donald Trump and uses human trafficking to advocate for tighter immigration policies. He then cites the COVID-19 economic shutdowns as a reason for an increase in child trafficking cases worldwide, though there is no reliable data to corroborate this. The latest State Department reports only include data through 2019, and reports of missing children in the United States often use skewed data and words like “likely missing” to imply that the numbers are far greater than they really are.

Meanwhile, organizations claiming to fight for victim advocacy rake in millions of dollars each year without much evidence to show that they actually help anyone at all. In October, it was revealed that Utah officials were investigating Operation Underground Railroad for allegedly taking credit for work that police did. Experts that deal with victims of child trafficking, and many sex worker advocacy groups, are vocal in opposing organizations like Operation Underground Railroad. Despite this, the organization’s viral success on social media from riding the coattails of extremist conspiracy theories has earned it millions of dollars in donations each year—much of which is spent on things like cars and fitness equipment, though OUR clarifies that these things are needed to carry out raids on sex work facilities.

The only way to save the children, Ballard argues, is to donate to organizations like Operation Underground Railroad that are fighting to end what it calls “modern slavery,” and to learn how to recognize warning signs of human trafficking out in the world (victim advocacy groups are vocal in clarifying that these warning signs are often filled with racist tropes, such as white parents with Black children, and vice versa). Within days of Ballard posting the video to YouTube, Instagram was flooded with informational graphics on child trafficking that contained misinformation. Operation Underground Railroad, thriving off of its sudden viral success, organized a virtual fundraiser for the World Day Against Trafficking in Persons on July 30th.


The Rise Up For Children event, which the organization says was inspired by Ballard’s response video to the Wayfair conspiracy theory, urged social media users to participate in a fight to end human trafficking. In the event’s FAQ section, the organization addresses its stance on the Wayfair accusations and says “If there is any substance to it, that would be left to law enforcement to investigate. The fact remains that children are sold online every day through various outlets, and this reality is a tragedy.”

It was just days later that social networks like Instagram had to ban hashtags often associated with the movement—and the organization—because they were being used to spread misinformation and indoctrinate users into conspiracy theories like QAnon. But by this time, thousands of social media users that had once been politically quiet were suddenly extremely passionate and vocal about a new issue: saving the children. The online movement—which often transferred to in-person demonstrations in U.S. cities—quickly became a gateway for right-wing extremism on places like Instagram and Twitter, where discussions about Ghislaine Maxwell and sexual exploitation often tied into allegations that political and Hollywood elites were involved in a highly secretive criminal enterprise.


For parents in suburban cities, social media served as a gateway to information that they felt they were not getting from reputable news outlets. Discussions in Twitter threads and subreddits dedicated to ending human trafficking served as a path to indoctrinate interested and impassioned men and women into an online extremist cult that Trump refused to condemn while on the campaign trail. When organizations like Operation Underground Railroad all but confirmed their worst fears—that children were being kidnapped and sold en masse every single day—while advocating for a Presidential candidate, it only served to push them further down the QAnon rabbit hole.

By the fall, widespread crackdowns on conspiracy theories like QAnon on social media sites like Facebook were too little, too late, and by that time Trump had tailored his campaign to hint to loyal QAnon believers that he had a secret plan up his sleeve by regurgitating posts from the movement’s biggest leaders. The Facebook groups that millions of social media users had come to depend on in recent months were suddenly gone.

Left with a sudden and gaping opportunity to create a centralized platform that allowed open and free discussion of things like QAnon and what it stood for, companies like Parler moved in. The social network, along with Gab, promised not to moderate content in the ways that Facebook and Twitter did, enabling millions of QAnon and Trump loyalists to discuss the movement without fear of being censored. Parler saw an increase of one million users in just one week in June of 2020, but the real migration to the app began when notable figures within the QAnon community faced bans on Twitter and Facebook in the fall.


Each time a right-wing leader or influencer was removed from a social network like Twitter or Facebook, they would move to Parler and urge their followers to spread the message that they had found a new platform elsewhere. Often, entire posts from Parler (posts that would otherwise break Twitter’s terms of use) would circulate on Twitter as screenshots from lesser-known QAnon “influencers.”

While mainstream news outlets were focused on what President Trump or his children said on Twitter, right-wing influencers that often echoed more dangerous versions of things that Trump would say were flying relatively under the radar of those that were not looking for them. People like Lin Wood, a lawyer for President Trump according to his Wikipedia page, and Dan Bongino, a conservative political commentator that claims to hold a stake in Parler, served as just a few of the many unofficial leaders of a widespread and decentralized network of influencers that acted as mouthpieces for right-wing propaganda and extremism. Posts that said things like “Barack Obama was the most corrupt President in history. RT if you agree!” served as viral introductions to the accounts that would later spread harmful conspiracy theories and election misinformation. Users that agreed with the statement would retweet and follow, only to be spoon-fed shocking, sensational conspiracy theories that came with a warning that mainstream news would never cover what was truly going on. Eventually, accounts like Wood’s would succumb to posting outright calls to assassinate political figures, including Vice President Mike Pence, after he would not object to the election results on January 6th.

In the months leading up to the January 6th uprising at the U.S. Capitol, far-right influencers—each with hundreds of thousands or millions of followers on platforms like Twitter—urged believers to gather in Washington to protest the election results. In the days leading up to the historic event, extremist accounts with a heavy presence on Twitter—many of them verified by the platform itself—would post things like “January 6th is going to be Biblical,” alluding to the events that would later take place. In some cases President Trump himself urged his supporters to gather on January 6th, but it was through accounts like Wood’s—which represented themselves as being within the close social circle of the President—that the real calls for violence and insurrection took place.

“Get the firing squads ready. Pence goes FIRST,” wrote Lin Wood on Parler late on January 6th after Congress was able to reconvene. Supporters felt betrayed by Vice President Pence after Trump wrote on Twitter that Pence was his last hope in securing the Presidency (a claim that was later determined false by fact checkers at various news organizations). Instantly, posts began to flood Twitter and Parler calling Pence names like “Judas,” further implying that Trump’s supporters see him as a leader of Biblical proportions. Though social networks have clear policies on violent threats and threats to civic integrity, the scale of the problem has made enforcing them effectively all but impossible.

A spokesperson for Twitter responded to Grit Daily’s request for comment on whether the company would tighten its ban on conspiracy theories in the wake of what happened on January 6th. “As we did throughout the 2020 US Presidential Election, our teams work in partnership with election officials and take strong action to protect the online, public conversation happening around elections in countries around the globe,” said the spokesperson.

The company implemented several new strategies to curb the spread of misinformation amid the 2020 election, but still offers only three options for reporting political content on the app. None of the three—“false information on how to vote,” voter suppression, and misrepresentation of affiliation with an elected official—creates a clear path for reporting misinformation or online political extremism.

When its CEO Jack Dorsey announced that Twitter would not allow political advertisements in the wake of Facebook’s Cambridge Analytica scandal, the company failed to address that its problem was never paid advertisements in the first place—it was misinformation coming from popular accounts through organic content.

The fringe conspiracy theories that once sat in the dark corners of sites like 8chan are now, often, trending topics on Twitter under dog whistles that can only be recognized by the very people that are a part of them. Phrases such as “squash Antifa scum,” when searched on Twitter, turn up thousands of violent threats that have circulated the platform for months. In many cases, calls for outright terrorism against left-leaning activists go unnoticed for weeks, or are hidden in the replies to organizing efforts from leaders that represent themselves as being close to President Trump in real life.

Too often, social media platforms take an issue seriously only after they’ve lost their ability to contain it. In 2018, Facebook admitted that its advertising platform had enabled the illegal data mining of its users’ information only after that data had been used to influence the outcome of the 2016 election. But while Twitter and Facebook release statements saying that they’re suspending President Trump and condemning the violence sparked by the President at the U.S. Capitol on Wednesday, calls for more violence and uprising flood right-wing networks of loyal followers that are now scrambling to find out what their leader wants from them.

Editor’s Note: Grit Daily has chosen to keep some sources for photos in this piece anonymous so as not to enable the spread of violent extremism. This article was originally published on January 7th, 2021. It was updated on January 8th with a response from a Twitter spokesperson.

Julia Sachs is a former Managing Editor at Grit Daily. She covers technology, social media and disinformation. She is based in Utah and before the pandemic she liked to travel.
