As the 2020 election heats up, social media sites and tech companies have had to fend off disinformation campaigns designed to influence the vote. But this year in particular, tech giants have a new — and quickly spreading — concern: the QAnon conspiracy theory.
What is QAnon?
QAnon is a baseless far-right conspiracy theory that originated from an anonymous figure designated as “Q” on the message boards of 4chan in October 2017.
The unknown figure claims to be a high-level government official within the Trump administration with access to classified intelligence, dropping intermittent, coded hints for supporters to decipher. The Q “drops” moved from 4chan to 8chan in late 2017; after 8chan was taken offline in 2019, they are reported to have continued on its successor site, 8kun.
Supporters of QAnon largely believe in a widely debunked theory that President Trump is fighting against liberal Washington, D.C., politicians and Hollywood elites engaged in Satanism, sex trafficking, and pedophilia — a so-called “Deep State” of forces trying to undermine him.
None of Q’s predictions have come true, but supporters still cling to the conspiracy and are able to quickly adapt to growing online trends to spread disinformation. QAnon supporters have also given older, regurgitated conspiracies new life, like #Pizzagate — which began circulating during the 2016 presidential election around the disproven accusation of a sex-trafficking ring being run out of a Washington, D.C., pizzeria.
The coronavirus pandemic has also become a talking point for the group, whose supporters falsely assert that COVID-19 is a “bioweapon.” More recently, QAnon supporters have co-opted the #SaveTheChildren hashtag to promote the false belief that “elites” use the blood of children to extend their lifespans.
How has QAnon spread?
It is unknown how many people seriously believe in QAnon conspiracies. Experts who study conspiracy theories told Digital Trends that QAnon supporters are “more deep than they are wide” — meaning those who are invested in QAnon theories are devotees of the ideology, while the majority of Americans either have little idea what the conspiracy fully entails or have heard only parts of it.
However, thanks to social media platforms like YouTube, Facebook, Twitter, Instagram, and TikTok, QAnon is no longer a fringe group, but mainstream. Posts and accounts surrounding the QAnon conspiracy theory have racked up thousands of clicks and shares — engagement often promoted unintentionally by the sites’ own recommendation algorithms.
Supporters have used popular social media platforms to coordinate troll-like behavior and attacks, promote their hashtags, and swell private groups into the millions, recruiting more people by the hour — ushering the movement’s rhetoric out of the internet’s fringes.
What are Facebook, Twitter, and others doing against QAnon?
Tech giants have only just begun to crack down on QAnon accounts, groups, and advertisements after rampant online conspiracies led to real-world violence.
Twitter, the first to crack down on the group, banned thousands of QAnon accounts, limited the reach of many more, and blocked its popular keywords and symbols from trending, such as “#WWG1WGA,” shorthand for “where we go one, we go all.”
TikTok disabled popular QAnon-related hashtags just as teenagers on the platform were beginning to take notice.
Facebook recently said it would not allow QAnon groups to purchase advertisements and took down hundreds of accounts spewing conspiracies on the site. And YouTube announced it would down-rank QAnon content on its home page, although Digital Trends recently found that such content still appeared in users’ feeds.
Despite the efforts of social media platforms, QAnon supporters are agile and coordinated and have proven adept at dodging Big Tech’s efforts to limit the conspiracy theory’s spread. QAnon content can still be found quite easily, if you know what to search for, sidestepping whatever algorithmic tweaks the sites roll out.
Even with increased content moderation, social media companies are unable to completely wipe QAnon content from their platforms without overstepping their own content policies.