Twitter (TWTR) has removed thousands of accounts linked to QAnon, a group known for spreading conspiracy theories and disinformation online.

“We’ve been clear that we will take strong enforcement action on behavior that has the potential to lead to offline harm,” Twitter’s safety team said late Tuesday in a tweet. “In line with this approach, this week we are taking further action on so-called ‘QAnon’ activity across the service.”

More than 7,000 accounts have been removed in the last several weeks, according to Twitter. It also expects that additional actions it is taking to limit the reach of QAnon activity on its platform could affect 150,000 accounts worldwide.

QAnon began as a single conspiracy theory. But its followers now act more like a virtual cult, largely adoring and believing whatever disinformation the conspiracy community spins up. Its main conspiracy theories claim dozens of politicians and A-list celebrities work in tandem with governments around the globe to engage in child sex abuse. Followers also believe there is a “deep state” effort to annihilate President Donald Trump.

“We will permanently suspend accounts Tweeting about these topics that we know are engaged in violations of our multi-account policy, coordinating abuse around individual victims, or are attempting to evade a previous suspension — something we’ve seen more of in recent weeks,” Twitter said. Twitter’s multi-account policy prohibits coordinating with others to artificially engage or amplify conversations.

There’s no evidence that any of what QAnon claims is factual. Followers make unfounded claims and then amplify them with doctored or out-of-context evidence posted on social media to support the allegations.

The anarchical group’s birth, and its continued seepage into mainstream American life, comes on the coattails of the Russian disinformation campaign that targeted US elections in 2016.
While the Russian campaign had an apparent objective — influence voters to elect Trump — QAnon is decentralized, having no clear objective aside from its popular slogan, “Question everything.” Anyone can create a conspiracy, offer evidence to support it and tag it with QAnon hashtags to spread it. But no one is held responsible for the trail of chaos and disinformation it leaves behind.

Twitter said it will also no longer serve content associated with QAnon in its Trends section and recommendations, prevent it from being highlighted in searches and block URLs associated with QAnon from being shared on Twitter. “These actions will be rolled out comprehensively this week,” the company said. “We will continue to review this activity across our service and update our rules and enforcement approach again if necessary.”

At least three GOP candidates have been sympathetic or supportive of QAnon: Jo Rae Perkins, a candidate for a US Senate seat in Oregon; Marjorie Taylor Greene, a Congressional candidate for Georgia’s 14th district seat; and Lauren Boebert, who beat a Trump-backed, five-term incumbent during primary elections to become a candidate for Colorado’s 3rd district.

Twitter has been taking more aggressive action against disinformation on its platform in recent months, placing warning labels on a post by President Trump about mail-in ballots and another during a protest where he said “looting” would lead to “shooting.” Facebook, which came under fire for not taking action against those posts, began adding labels to some of Trump’s more recent posts. But rather than attempt to fact-check the posts as true or false concerning claims about mail-in voting, the labels direct users to a government website to learn more about how to vote.

And while Twitter has implemented several policy changes to crack down on misinformation, enforcing them could prove a challenge and may have only a limited impact on preventing the group’s conspiracy theories from spreading.
QAnon followers are active on Facebook, Reddit, YouTube and other darker corners of the internet.

Three sources familiar with Facebook’s work on misinformation told CNN Wednesday the company is planning to take action against QAnon. One source said the company has been studying QAnon for some time and assessing how its existing policies would apply to the group. The company recently took action against the Boogaloo extremist movement, and some of that work overlapped with its assessment of a plan to cover QAnon, the source said.

Another source said Facebook’s challenge is determining what actions the company could take that would be effective and not be easily evaded. Followers of groups that face crackdowns from social media companies are known to use evolving terminology and iconography designed to skirt rules.

— CNN’s Paul P. Murphy, Donie O’Sullivan, Marshall Cohen and Michelle Toh contributed to this report.