Despite a crackdown by Twitter, there were more than 93,700 QAnon-related accounts still on the platform as of October 15, according to new data from non-partisan nonprofit Advance Democracy.
In July, Twitter removed thousands of accounts linked to the QAnon conspiracy group and said it would "permanently suspend accounts Tweeting about these topics" and "coordinating abuse around individual victims." Last month, Facebook said it would ban any pages, groups and Instagram accounts representing QAnon, and YouTube has also taken similar actions to limit its spread.
QAnon believers have embraced a number of different and often contradictory theories, but at its core the far-right conspiracy theory falsely claims that a cabal of politicians and A-list celebrities is engaged in child sex abuse, and that a "deep state" is working to undermine President Trump.
Even with efforts to combat QAnon content, accounts associated with the group are among the most active on Twitter in battleground states ahead of Election Day.
"QAnon continues to have a substantial influence on the stories and narratives promoted on social media -- and that’s especially true when it comes to conversations about this election. These accounts are promoting right-wing fringe conspiracy theories, election disinformation, and divisive content at alarming rates," said Daniel J. Jones, president of Advance Democracy, and a former FBI analyst and Senate investigator.
"To date, through our work to deamplify content and accounts associated with QAnon we have reduced impressions on QAnon-related tweets by more than 50%, meaning our users are seeing less unhealthy content on their feeds as a direct result of this cross functional effort," a Twitter spokesperson told CNN Business. "As always, Tweets are subject to all of the Twitter Rules and we will continue to take the necessary additional enforcement actions when shared content violates our policies."
- QAnon is inserting itself into the election conversation: About 4.1% of Texas-based posts about the 2020 election came from QAnon-related accounts (718,900 posts); so did 3.5% of Florida-based posts (541,400), 3.7% of North Carolina-based posts (194,600), and 2.3% of Pennsylvania-based posts (159,700).
- How the analysis worked: Posts were counted as being about the 2020 election if they included terms or hashtags such as: voting, election, mail-in, "#riggedelection," "#votersuppression" and so on. A tweet, retweet, quote tweet, or reply was attributed to a state if the posting account's profile location was set to that state, if the post was geotagged in the state, or if the account's Twitter bio referenced certain towns or cities in that state. The report analyzed Twitter activity from January 1 to October 15, and its analysis only included accounts that remained active on Twitter as of October 20.
- In the report, Advance Democracy said its definition of "QAnon-related" accounts was "extremely conservative" because it includes only those Twitter accounts with an explicit reference to QAnon in their Twitter bio, such as the hashtag #QAnon or terms associated with the conspiracy movement, including "the great awakening" or "where we go one we go all." As a result, it believes the proportion of the conversation on Twitter connected to those who align themselves with QAnon is likely much higher than what its report found.
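The keyword-and-bio matching described in the methodology above can be sketched in a few lines. This is an illustrative approximation only: the function names, data layout, and the short term lists are assumptions, since the report's full keyword set and matching rules are not published.

```python
# Illustrative sketch of the report's classification approach.
# The term lists below are the examples cited in the article, not the
# report's complete lists; all names and structures are hypothetical.

ELECTION_TERMS = {"voting", "election", "mail-in",
                  "#riggedelection", "#votersuppression"}
QANON_BIO_MARKERS = {"#qanon", "the great awakening",
                     "where we go one we go all"}

def is_election_post(text: str) -> bool:
    """True if the post text contains any election-related term or hashtag."""
    lowered = text.lower()
    return any(term in lowered for term in ELECTION_TERMS)

def is_qanon_related(bio: str) -> bool:
    """True if the account bio carries an explicit QAnon reference
    (the 'extremely conservative' bio-only definition)."""
    lowered = bio.lower()
    return any(marker in lowered for marker in QANON_BIO_MARKERS)

def qanon_share(posts):
    """posts: iterable of (post_text, account_bio) pairs.
    Returns the fraction of election posts from QAnon-related accounts."""
    election = [(text, bio) for text, bio in posts if is_election_post(text)]
    if not election:
        return 0.0
    qanon = sum(1 for _, bio in election if is_qanon_related(bio))
    return qanon / len(election)
```

Under this scheme, a state-level figure like Texas's 4.1% would be `qanon_share` computed over all posts attributed to that state.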