Facebook and YouTube take action against QAnon — but they’re fighting a losing battle


Recent decisions by Facebook and YouTube to crack down on the far-right conspiracy theory movement known as QAnon will disrupt the ability of dangerous online communities to spread their radical messages, but it won’t stop them completely.

Facebook's Oct. 6 announcement that it would take down any "accounts representing QAnon, even if they contain no violent content," followed earlier decisions by the social media platform to down-rank QAnon content in Facebook searches. YouTube followed on Oct. 15 with new rules about conspiracy videos, but it stopped short of a complete ban.

This month marks the third anniversary of the movement that started when someone known only as Q posted a series of conspiracy theories on the internet forum 4chan. Q warned of a deep state satanic ring of global elites involved in pedophilia and sex trafficking and asserted that U.S. President Donald Trump was working on a secret plan to take them all down.

QAnon now a global phenomenon

Until this year, most people had never heard of QAnon. But over the course of 2020, the fringe movement has gained widespread traction both in the United States and internationally, including among a number of Republican politicians who openly campaigned as Q supporters.

I have been researching QAnon for more than two years, and its recent evolution has shocked even me.

What most people don’t realize is that the QAnon of July and August was a different movement from what QAnon has become in October. I have never seen a movement evolve or radicalize as fast as QAnon — and it’s happening at a time when the global socio-political environment is much different than it was in the summer.


All of these factors came into play when Facebook decided to take action against “militarized social movements and QAnon.”

In the weeks leading up to the ban, I had seen a trend in more violent content on Facebook, especially with the circulation of memes and videos promoting “vehicle ramming attacks” with the slogan “all lives splatter” and other racist messages against Black people.

In explaining its ban, Facebook noted while it had “removed QAnon content that celebrates and supports violence, we’ve seen other QAnon content tied to different forms of real-world harm, including recent claims that the (U.S.) West Coast wildfires were started by certain groups, which diverted the attention of local officials from fighting the fires and protecting the public.”

Prior action was ineffective

Prior to the outright ban, Facebook’s earlier attempts to disrupt QAnon groups from organizing on Facebook and Instagram were not enough to stop its fake messages from spreading.

One way Q supporters adapted was through lighter forms of propaganda — something I call Pastel QAnon. As a way to circumvent the initial Facebook sanctions, women who believe in the QAnon conspiracies were using warm and colorful images to spread QAnon theories through health and wellness communities and by infiltrating legitimate charitable campaigns against child trafficking.