Over the past few years, the QAnon movement, a baseless conspiracy theory centered on the belief in a cabal of deep-state leftist pedophiles, has gained tremendous traction, and extremism researchers believe that's largely because social platforms have turned a blind eye to such content.
That seems to be changing, however, now that Facebook announced in a press release on Tuesday that it would be removing “any Facebook Pages, Groups and Instagram accounts representing QAnon, even if they contain no violent content.”
The announcement is an update of a policy rolled out back in August, when Facebook announced that it would be removing more than 1,500 pages and groups related to QAnon. The company said the initiative was part of a new set of measures designed to “disrupt the ability of QAnon and militarized social movements to operate and organize on our platform.”
Its efforts, however, fell short, according to critics: the New York Times reported last month that 100 QAnon groups it was tracking had collectively gained more than 13,000 new followers in the month after the policy took effect. The explosive growth of #SaveTheChildren, an anti-child-trafficking movement that helped bring the QAnon community into the mainstream, also fueled the conspiracy theory's renewed popularity on social platforms.
“Facebook has made bold statements in the past regarding their commitment to reducing the spread of hate speech and misinformation on the platform, but their enforcement has been inconsistent and largely ineffective,” says Kathleen Stansberry, assistant professor of strategic communications at Elon University. “It remains to be seen if Facebook will follow through on their promise and devote the necessary time and resources to prevent the QAnon conspiracy theory from building traction on the site.”
In its press release, Facebook said that “several issues” led it to update its policies regarding QAnon, including QAnon “content tied to different forms of real world harm”; it cited misinformation about the cause of the West Coast wildfires as an example.
The QAnon conspiracy theory, shared among some far-right extremists, holds that President Trump is lying in wait to overthrow a cabal of deep-state leftists, including such prominent figures as Hillary Clinton and Barack Obama.
Over the past year, the community has moved increasingly into the mainstream, with congressional candidates espousing the conspiracy theory and President Trump himself making a public statement that appeared to embrace the movement. Most recently, following Trump’s COVID-19 diagnosis, QAnon believers promoted the idea that his illness was a hoax, providing cover for the president to move against his detractors.
In the past, Twitter and TikTok have made cursory efforts to scrub QAnon believers from their platforms, largely by removing individual accounts or banning QAnon-associated hashtags. Facebook’s most recent announcement represents one of the most sweeping actions taken by a platform to combat the conspiracy theory.
“Facebook’s move to ban groups, pages and Instagram accounts that represent QAnon won’t scrub the conspiracy theories that fuel the movement from the web,” says Stansberry. “But the decision could force believers back to the more niche discussion boards where the movement began.”