Misinformation Watch

By Donie O'Sullivan, Kaya Yurieff, Kelly Bourdet, the CNN Business team and contributors from across CNN

Updated 11:21 a.m. ET, January 26, 2021
37 Posts
6:54 p.m. ET, November 2, 2020

New report finds nearly 94,000 QAnon accounts are still on Twitter

CNN Business' Kaya Yurieff

Despite a crackdown by Twitter, there were more than 93,700 QAnon-related accounts still on the platform as of October 15, according to new data from non-partisan nonprofit Advance Democracy.

In July, Twitter removed thousands of accounts linked to the QAnon conspiracy group and said it would "permanently suspend accounts Tweeting about these topics" and "coordinating abuse around individual victims." Last month, Facebook said it would ban any pages, groups and Instagram accounts representing QAnon, and YouTube has also taken similar actions to limit its spread.

QAnon believers have embraced a number of different and often contradictory theories, but the basic false beliefs underlying the far-right conspiracy theory are claims about a cabal of politicians and A-list celebrities engaging in child sex abuse, and a “deep state” effort to undermine President Trump.

Even with efforts to combat QAnon content, accounts associated with the group are among the most active on Twitter in battleground states ahead of Election Day.

"QAnon continues to have a substantial influence on the stories and narratives promoted on social media -- and that’s especially true when it comes to conversations about this election. These accounts are promoting right-wing fringe conspiracy theories, election disinformation, and divisive content at alarming rates," said Daniel J. Jones, president of Advance Democracy, and a former FBI analyst and Senate investigator.

"To date, through our work to deamplify content and accounts associated with QAnon we have reduced impressions on QAnon-related tweets by more than 50%, meaning our users are seeing less unhealthy content on their feeds as a direct result of this cross functional effort," a Twitter spokesperson told CNN Business. "As always, Tweets are subject to all of the Twitter Rules and we will continue to take the necessary additional enforcement actions when shared content violates our policies."


  • QAnon is inserting itself into the election conversation: About 4.1% of Texas-based posts about the 2020 election came from QAnon-related accounts (that's 718,900 posts); 3.5% of the Florida-based posts (541,400), and 3.7% of the North Carolina-based posts (194,600). Meanwhile, 2.3% of Pennsylvania-based posts (159,700) about the 2020 election came from QAnon-related accounts.
  • How the analysis worked: Posts were determined to be about the 2020 election if they included terms or hashtags such as: voting, election, mail-in, “#riggedelection,” “#votersuppression” and so on. Locations were defined as any tweet, retweet, quote tweet, or reply by an account whose location is set to that state, along with every post geotagged in the state or accounts with references to certain towns or cities in that state in their Twitter bios. The report analyzed Twitter activity from January 1 to October 15, and its analysis only included accounts that remained active on Twitter as of October 20.
  • In the report, Advance Democracy said its definition of “QAnon-related” accounts was “extremely conservative” because it includes only those Twitter accounts with an explicit reference to QAnon in their Twitter bio, such as hashtags like #QAnon or terms associated with the conspiracy movement including “the great awakening” or “where we go one we go all.” As a result, it believes the proportion of the conversation on Twitter connected to those who align themselves with QAnon is likely much higher than what its report found.
3:46 p.m. ET, November 2, 2020

Here's what Big Tech employees are worried about on Election Day

CNN Business' Donie O'Sullivan

"The shift over time of election results as different types of ballots are fully counted is my biggest source of concern right now," one employee who works on countering misinformation for a major social media platform said on Saturday. "If it isn't a landslide one way or the other, every race that leans one direction and goes another is a potential flashpoint for offline violence."

The days -- and possibly weeks -- after Election Day will be a huge test for platforms like Facebook, Twitter, and Google's YouTube. Doctored videos that could potentially be spread by anyone; fake accounts that could pop up anywhere; and tweets from President Trump himself could all contribute to undermining the result of the election and perhaps even stoke offline violence.

CNN Business spoke to more than a dozen people who are either employees at the major social media platforms working on the teams countering misinformation and extremism or people who work directly with those teams at the companies.

CNN Business granted them anonymity so they could speak about their work more freely.

"My biggest fear at this point is something totally unexpected happening that no one predicted," one Big Tech employee said. "This year we've all been preparing and working through scenarios for every possibility that we can think of, but this year has taught me not everything can be predicted."

Read more here

3:46 p.m. ET, November 2, 2020

Fact check: Viral photo of Biden on a plane without a mask is from before the pandemic

CNN's Daniel Dale

On Sunday, Richard Grenell, President Donald Trump's former ambassador to Germany and former acting director of national intelligence, tweeted two photos side by side: one of former Vice President Joe Biden standing on an airplane without a mask and one of Biden standing outdoors while wearing a mask.

"Washington, DC phony! @JoeBiden doesn't wear a mask on a plane - but wears one OUTSIDE!?" wrote Grenell, a prominent Trump campaign ally and a paid Republican National Committee senior adviser. Grenell had more than 671,000 Twitter followers as of Monday.

Grenell's tweet was retweeted more than 16,000 times. And prominent right-wing talk radio host Mark Levin, who had about 2.6 million followers as of Monday, generated an additional 12,800-plus retweets by sharing Grenell's tweet on Sunday and adding his own accusation that Biden is a "fraud."

Facts First: The tweets by Grenell and Levin are egregiously deceptive. They create the false impression that the photo of Biden without a mask on a plane was taken during the coronavirus pandemic. The photo was actually taken in November 2019, before the pandemic.

Read more here

3:44 p.m. ET, November 2, 2020

False video of Joe Biden viewed one million times on Twitter

CNN Business' Donie O'Sullivan

A deceptively edited video of Joe Biden making it appear the Democratic presidential nominee forgot what state he was in was viewed more than one million times on Twitter over the weekend.

In the video, Biden addresses a crowd, saying, "Hello, Minnesota!" The event did, indeed, take place in St. Paul, Minnesota.

In the unedited, original video, signs in front of and behind Biden on the stage read "Text MN to 30330" — making it clear the event was in Minnesota.

However, in the false video, the on-stage signs were edited to read "Tampa, Florida," and "Text FL to 30330."

The video was shared on Twitter by a person who accused Biden of forgetting what state he was in.

Read more here

7:37 p.m. ET, November 1, 2020

Facebook cracks down on QAnon hashtag #SaveOurChildren

From CNN Business' Rishi Iyengar

Facebook will expand its action against QAnon by restricting #SaveOurChildren, one of the hashtags supporters of the conspiracy theory often append to their social media posts.

Starting Friday, the company will be "limiting the distribution" of the hashtag, spokesperson Emily Cain said in a statement to CNN Business, meaning that posts using the hashtag will have their visibility reduced in the News Feed and people clicking on the hashtag will not be able to see the aggregated results.

Instead, they will see a link to a list of "credible child safety resources," Cain added. 

Save The Children is a respected humanitarian organization that has been around for more than 100 years, but QAnon followers have hijacked and bastardized the name "Save The Children" as a way to spread baseless conspiracy theories about prominent Democrats, including former Vice President Joe Biden.

Posts about the conspiracy theory often include the hashtags #SaveTheChildren or #SaveOurChildren. Facebook said it will only be limiting the distribution of the latter for now, and will continue to monitor different hashtags and other methods by which QAnon supporters might try to continue evading detection.

Searching for #SaveTheChildren shows a prompt from Facebook asking if you're looking for the humanitarian organization with a link to its website. You also have the option of proceeding to the search results. Other similar hashtags show a link to a page with child safety resources in addition to the regular search results.

QAnon followers believe children need to be "saved" from a cabal of evil Democrats. It is essentially the same conspiracy theory that was pushed as part of "Pizzagate" in 2016, which falsely alleged a Washington, DC, pizza shop was at the center of a child sex trafficking ring.

The "Save The Children" charity has nothing to do with QAnon and has publicly sought to distance itself from the conspiracy theory and its followers. Other child protection organizations have said these conspiracy theories are creating dangerous distractions from the real issue of child exploitation.

Facebook announced a ban on QAnon earlier this month, three years after the conspiracy theory first emerged. Twitter and YouTube have also imposed varying degrees of restrictions on QAnon content.

But the platforms have allowed QAnon content to grow and spread for years. There are now multiple Republicans running for Congress who have expressed support for QAnon.

In August, President Donald Trump praised QAnon followers for supporting him.

"I don't know much about the movement other than I understand they like me very much, which I appreciate," Trump said in the White House briefing room.

Last year, an FBI office warned that QAnon adherents are a domestic terrorism threat.

-- CNN Business' Donie O'Sullivan contributed to this report

12:56 p.m. ET, October 30, 2020

How Wikipedia will fight Election Day misinformation

From CNN Business' Kaya Yurieff

Staffers at Wikipedia's parent organization and the volunteer editors who maintain its millions of pages have a plan to ensure that election-related entries aren't improperly edited.

Last week, the Wikipedia community placed “extended protections” on the 2020 United States presidential election page, which means only experienced volunteers with at least 500 edits and 30 days on the platform can make changes. Other pages related to the election and presidential candidates already have protections, like the articles for Hunter Biden, the son of Democratic presidential nominee Joe Biden, Jared Kushner, President Donald Trump’s son-in-law, and the pages for both the Trump and Biden campaigns.

Generally, anyone can go into an article and make a change. However, there are varying levels of protection for what Wikipedia calls contested pages, which range from political topics to more obscure subjects over which editors disagree.

There are over 70 English-language articles about the 2020 election, according to the Wikimedia Foundation, Wikipedia's parent. It said more articles may be protected as Election Day nears.

Editors will be monitoring a list of relevant articles on Election Day and beyond. If someone makes an edit to those pages, over 500 people will get an email alerting them that there could be something worth checking.

Wired previously reported that editors have been actively discussing what measures they are considering for election night on a public page.

Since late August, some Wikimedia staff have been running through different scenarios of what could happen on its site during the election, such as how it would handle malicious content or a coordinated attack by multiple accounts making edits across several Wikipedia pages on Election Day.

“We are under no illusions that we will prevent every bad edit from making it onto the site," said Wikimedia chief of staff Ryan Merkley, who leads its new internal US election task force. "We think our responsibility is to make sure that we are as prepared to respond and that we can do it as swiftly as possible and ideally prevent its spread broadly.”

11:44 a.m. ET, October 30, 2020

Instagram hides recent posts from hashtag pages ahead of election

From CNN Business' Kaya Yurieff

Ahead of Election Day, Instagram has moved to temporarily restrict a popular way to browse posts.

Instagram announced that it will temporarily hide the "Recent" tab from showing up on all hashtag pages — whether they're related to politics or not. The company said it hopes the move will help prevent the spread of misinformation and harmful content related to the election.

Hashtag pages will still work; they'll just show only "Top Posts" as determined by the platform's algorithms. This may include some recent posts.

An Instagram spokesperson said the change was rolled out Thursday evening, and there is no specific timeline for when the action will end.

Other social platforms have implemented similar temporary changes ahead of Election Day. For example, Twitter is encouraging users to quote tweet rather than retweet, hoping people will add context or a reaction before spreading information.

9:02 p.m. ET, October 29, 2020

Twitter calls out Russian misinformation about US election

From CNN Business' Donie O’Sullivan and Marshall Cohen

A picture taken on June 8, 2018 shows unidentified directors of Russia Today (RT) in their control room in Moscow. Yuri Kadobnov/AFP/Getty Images

Twitter labeled a video from the Russian-state controlled broadcaster RT as election misinformation on Thursday.

RT is registered with the US Justice Department as an agent of the Russian government. This is the first time Twitter has taken action against RT in this way over US election misinformation, Twitter confirmed to CNN.

The four-minute video posted by RT was titled "Questions mount amid voter fraud, rigging claims ahead of #USelection."

Twitter deactivated the retweet function on the video to reduce how much it can be shared, and slapped a label over it that read, "Some or all of the content shared in this Tweet is disputed and might be misleading about how to participate in an election or another civic process."

The Kremlin uses RT to spread English-language propaganda to American audiences, and the network was part of Russia's election meddling in 2016, according to US intelligence agencies.

A report released by the US intelligence community in 2017 said RT has historically "portrayed the US electoral process as undemocratic” and amplifies false narratives claiming that "US election results cannot be trusted.”

The four-minute video that RT posted Thursday touches on many of these themes. It raises concerns about "fraud" and echoes many of the lies President Donald Trump has spread about mail-in voting. The segment cites Fox News, which has championed many of Trump's attacks against the electoral process. It highlights isolated incidents of ballot mishaps, many of which have already been deemed by local authorities to be accidents and errors -- and not fraud.

Earlier this year, an internal intelligence bulletin issued by the Department of Homeland Security said Russia was amplifying disinformation about mail-in voting as part of a broader effort "to undermine public trust in the electoral process."

11:58 a.m. ET, October 29, 2020

Facebook fact-checker calls the person who sent him a death threat

CNN Business' Gabe Ramirez

Facebook has hired a network of fact-checkers across America. CNN talks to two who have received threats for simply doing their jobs during the 2020 election cycle.