The increasingly visible and vocal followers of QAnon promote a bewildering blend of unsubstantiated conspiracy theories, worrying everyone from Facebook to the FBI.

Once on the fringes of the internet and focused on US politics, the movement has seen sharp growth on mainstream social media platforms this year, prompting tech firms to tighten controls and ban QAnon followers.

The movement is centred on the unsubstantiated belief that the world is run by a cabal of Satan-worshipping paedophiles. This year it has extended to allege, without proof, that the coronavirus is a conspiracy by that group to control people using vaccines and 5G.

Researchers detected sharp spikes in QAnon content and related searches in March, when many countries had started imposing lockdowns and other social distancing measures.

The anxiety, frustration and economic pain caused by the pandemic – coupled with the increased amount of time people were spending online – became an explosive mix that drew people to QAnon, experts say.

"QAnon blamed these events on global elites while also increasing distrust in mainstream media, government and organisations such as the WHO," said Mackenzie Hart, a disinformation researcher at the London-based ISD think tank.

Core QAnon beliefs were also coupled with anti-vaccine messaging and far-right campaigns, further expanding the movement's following.

Tech analysts have pointed to a feature at the core of most major social media platforms as a key driver of QAnon's growth: the recommendation algorithm. Users who view, post or search for certain content are guided to what the platform's algorithm determines to be other content they may be interested in. Analysts have said this helped link existing conspiracy theories – such as those about vaccines and 5G – with QAnon.

"They know this, especially the core of true believers, they are very good at leveraging the algorithmic... amplification techniques to drive engagement to their videos or posts," said Alex Newhouse, a disinformation researcher at Middlebury College's Center on Terrorism, Extremism, and Counterterrorism.

"QAnon would not exist in the volume that it exists without the recommendation algorithms on the big tech platforms."

QAnon followers, in a bid to defeat the Satanist paedophile cabal, have also hijacked hashtags such as #SaveTheChildren, a move experts say has harmed serious efforts to stop human trafficking.

Researchers have found that Instagram influencers, including those who do not directly reference the movement, have used colourful and inviting visuals to promote QAnon conspiracy theories.

Social media giants have been forced to act in recent months as QAnon content spread far and wide, with researchers finding material originating from around 70 countries.

Hundreds of thousands of pages, groups, users and hashtags – including the QAnon slogan WWG1WGA, or "Where we go one, we go all" – have been removed, blocked or hidden on major social media. Platforms such as Facebook, Twitter, TikTok and YouTube have also ramped up surveillance of QAnon content as adherents attempt to bypass the new filters.

According to researchers, the filtering and blocking of some QAnon-related content has had an impact, with a decrease in searches on Google and engagement with Facebook groups.

But content remains rife. According to an NBC News report citing internal Facebook documents, QAnon groups and pages have millions of followers and members.

Cutting off or reducing access to potential new followers is one of the few effective methods available to tech giants, experts say, but QAnon has already amassed a huge hardcore following. Many are entrenched on the free-for-all fringes of the internet, such as the anonymous message boards 4chan and 8kun, and encrypted communication apps like Telegram.

"At this point, kicking off the true believers, banning them... is whack-a-mole. They're going to keep evading bans (on social media)," added Mr Newhouse.

"It's going to be a lot harder task to tackle the now millions of true believers who are out there."