Almost three years after a conspiracy theory known as QAnon began spreading in dark corners of the internet, pushing unfounded claims about a so-called “deep state” and prompting concerns about targeted harassment, social media platforms are finally looking to do something about it.

Twitter (TWTR) announced late Tuesday that it had removed thousands of accounts linked to QAnon, citing its policy of taking action on accounts with “the potential to lead to offline harm.” Twitter said it would “permanently suspend accounts Tweeting about these topics” and “coordinating abuse around individual victims.”

Facebook (FB) is also planning to take action against QAnon, according to three sources familiar with the company’s work on misinformation. One source said the company has been studying QAnon for some time and assessing how its existing policies would apply to it.

While these actions may be welcome news for people who have been harassed for years by followers of the conspiracy theory, whose adherence to dogma even when it is obviously false or self-contradictory can resemble a cult, these policy moves nonetheless raise the question of why it took the platforms so long to act.

As November’s election approaches, technology companies are announcing more policies aimed at cracking down on the spread of misinformation. Whether the companies enact these policies successfully and effectively will be subject to scrutiny.

CNN Business has also reached out to YouTube, where QAnon-pushing videos have proliferated, to ask whether it plans to crack down on QAnon content beyond its existing policies.

In the years since QAnon began on the hate-filled message board 4chan, it has taken on a life of its own. What was once an unfounded conspiracy theory about a cabal of pedophiles and a government “deep state” has become a fringe movement, a catchall for proponents of all shades of misinformation.
Multiple Republican candidates running in November’s elections have embraced QAnon. President Donald Trump has retweeted QAnon accounts, and his rallies are often sprinkled with attendees carrying QAnon-related signs and wearing QAnon shirts. Michael Flynn, Trump’s first national security adviser, posted a video earlier this month in which he used phrases and slogans that are hallmarks of the QAnon movement.

“Disinformation and dangerous conspiracy theories have been a major issue on social media for years, and social platforms have done substantial harm by escalating divisions in our society and radicalizing people towards conspiracies and hate,” Jonathan Greenblatt, the CEO of the Anti-Defamation League, told CNN Business on Wednesday in response to Twitter’s QAnon takedown. “Still, it’s never too late to do the right thing.”

In announcing its crackdown, Twitter also confirmed something troubling: The social network’s algorithms have been recommending and highlighting QAnon content to Twitter users. That means Twitter itself, and not just the people on it, played a role in amplifying the conspiracy theory.

Facebook, meanwhile, hosts many groups devoted to the conspiracy theory, some of them listed as having tens of thousands of members. Facebook groups can be rabbit holes, ideological echo chambers where biases and beliefs, accurate or not, are reinforced.

It is unclear whether all subscribers to QAnon believe, or even know about, all the absurd claims tied into the conspiracy theory or raised by its other supporters. Indeed, support for QAnon has generally become a badge of anti-establishment thinking on the fringes. But the rise of QAnon highlights how easily the lines between the fringe and the mainstream can blur when conspiracy theories are allowed to run rampant on some of the most powerful online platforms in the world, reaching an untold number of people.
In perhaps the clearest sign of how influential this conspiracy theory has become, even as Twitter took down thousands of QAnon accounts, it left up accounts belonging to QAnon supporters now running for Congress. Twitter has maintained elsewhere that it does not want to silence candidates — the company largely believes the public should be able to judge candidates for themselves, warts and all. In short, QAnon may already have reached a level of prominence that makes it hard, if not impossible, to shut down completely.