A Facebook Ban Won’t Stop QAnon


QAnon, the conspiracy theory that claims President Trump is secretly battling a Hollywood-Jewish-Democrat-deep state-globalist cabal of Satanist-murderer-pedophile-human traffickers, is huge. In both the span of its reach and the depth of its ideas, the conspiracy has grown into a juggernaut of misinformation. (“We call it a superconspiracy,” says Antonis Papasavva, a data scientist at University College London. “Name any conspiracy theory—JFK, MK Ultra, Pizzagate—it’s in there.”) This week, Facebook vowed to remove any pages, groups, or Instagram accounts that represent QAnon, which has gobbled up loads of engagement on the platform thanks to its something-for-everybody theories. Up until two months ago, Facebook didn’t really have any policies when it came to QAnon, and Tuesday’s ban marked a sharp escalation. Sharp, but also perhaps too late.

In case you are (blissfully) unaware, QAnon was born on the internet. Its prophet, Q, amassed followers by posting cryptic messages on 8kun, a message board popular with extremists, but the conspiracy theory has since seeped into every mainstream social media platform. Unlike a lot of conspiracy-minded internet subcultures, QAnon has had no trouble moving offline. At first, it was just T-shirts and mysterious billboards. Now QAnon has allegedly inspired criminal acts including murder and terrorism, been endorsed by multiple Republican congressional candidates, and had its followers praised as patriots by President Trump.

In August, after years of activists calling for Facebook to take a stronger stance against QAnon—which has promoted violence, anti-Semitism, racism, and Covid-19 misinformation on the platform—Facebook took a first step: it announced that it would restrict QAnon content by removing it from recommendation algorithms and taking down pages and accounts that discussed real-world violence. According to Facebook, the August crackdown led to the removal of more than 1,500 Facebook groups and pages, but QAnon has continued to flourish. Experts think it'll go on flourishing, ban or no ban.

Facebook’s QAnon ban also has a cavernous loophole: It only targets entities that “represent” QAnon. “If I designate myself Queen of QAnon today, does that mean I’ll be removed?” asks Joan Donovan, research director at Harvard’s Shorenstein Center, where she studies online extremism. “I can’t see a world in which anyone is considered a representative of a conspiracy theory other than Q.” According to Facebook, QAnon “representatives” would have the word QAnon in their handle and bio or title and About section, and share QAnon posts at a rate above a threshold the company isn’t divulging. Deciding who and what checks those boxes will be left to Facebook’s Dangerous Organizations Operations team, which handles terrorists and hate groups. “It’s content moderation by press release,” says Donovan. The announcement is strong, but it’s unclear how wide-ranging or enforceable the new policy really is.

If you think that it would now be pretty easy to camouflage an abiding QAnon passion as a passing or even accidental interest by changing some words in your bio, you’d be correct. Plus, extremist groups are experts at going underground to escape public scrutiny. “I’m skeptical that this ban will have any impact in the long run,” says Phyllis Gerstenfeld, who studies online extremism and criminology at California State University, Stanislaus. “Extremists find new ways to repackage themselves.” QAnon adherents have already demonstrated their ability to do this: they hijacked the hashtags #SavetheChildren and #SaveOurChildren and used them to reach new audiences who would never have joined a QAnon group, but do care about kids.

As for hiding the actual QAnon label, that’s happening right now, this very minute. Even before the ban, QAnon groups were discussing alternate ways of identifying themselves to avoid detection and moderation. Tech-censorship doomsday strategizing is common to all online extremist groups, both because they constantly break terms of service and because it suits their paranoid worldview. In this case, people actually had orders to do so from on high: the user identifying themselves as Q told them to “Drop all references re: ‘Q’ ‘Qanon’ etc. to avoid ban/termination.” Some groups have been using “17” as a replacement call sign, but it will be something new by morning.
