- YouTube announced in a blog post Thursday morning that it would begin banning QAnon conspiracy theory content that harasses or threatens others.
- The company said it had already removed tens of thousands of QAnon videos and taken down hundreds of channels, but that the new measures would lead to the removal of even more content.
- The announcement, which stops short of an explicit QAnon ban, comes just days after YouTube CEO Susan Wojcicki declined to commit to such a ban in a CNN interview, following a wave of actions against the conspiracy theory by other tech companies.
YouTube announced in a blog post Thursday morning that it would prohibit content that uses the QAnon conspiracy theory to target or harass an individual or group.
The move — which is not a ban on QAnon — is part of an expansion of YouTube’s hate and harassment policies that now prohibit content promoting “conspiracy theories that have been used to justify real-world violence.”
The company said it had already removed “tens of thousands” of QAnon-related videos and taken down hundreds of channels in accordance with the platform’s “existing policies” against hate and harassment, but that the new measures would lead to the prohibition and removal of more videos.
“We will begin enforcing this updated policy today, and will ramp up in the weeks to come,” the company said on Thursday. “Due to the evolving nature and shifting tactics of groups promoting these conspiracy theories, we’ll continue to adapt our policies to stay current and remain committed to taking the steps needed to live up to this responsibility.”
Several crimes have been linked to QAnon, the baseless far-right conspiracy theory that alleges President Donald Trump is fighting a deep-state cabal of human traffickers. In 2019, the FBI warned in a bulletin that QAnon could become a domestic terrorism threat. In the year since that statement, the movement has been linked to additional criminal activities, including attempted kidnappings.
In the past, the company has drawn ire for its infamous video-recommendation algorithm, which has been known to lead people down rabbit holes of disinformation and radicalization.
In Thursday’s blog post, the company said that it had been working on limiting “the reach of harmful misinformation” for close to two years. YouTube said that “the number of views that come from non-subscribed recommendations to prominent Q-related channels” has fallen by more than 80% since January 2019.
YouTube’s post specified that the platform would allow news coverage or videos debunking QAnon. Insider previously reported that creators debunking misinformation had been wrongfully hit with strikes from the platform’s automated moderation system.
The announcement comes just days after Wojcicki drew criticism for failing to take a strong stance against QAnon in an interview with CNN. “We’re looking very closely at QAnon,” Wojcicki said, but she did not say whether the platform would ban the conspiracy theory movement.
Facebook, which has also been criticized for allowing QAnon to flourish and spread in the three years since the conspiracy theory's inception, announced on October 6 that all Facebook pages, groups, and Instagram accounts associated with QAnon were no longer permitted on its platforms and would be removed.