YouTube won't ban QAnon content, but will remove videos that may promote violence
YouTube is the latest Silicon Valley company to update its moderation policies around the QAnon conspiracy theory, announcing that content targeting or harassing people based on conspiracy theories will be removed. However, YouTube will not issue a blanket ban on QAnon content.
The company is trying to reduce harassment and hate by "removing more conspiracy theory content used to justify real-world violence," according to its new blog post. This means that if people post QAnon videos alleging anything that could cause harm or harassment to a specific person or group, those videos will be removed. The blog post did not indicate whether the channels behind them would also be deleted, although YouTube typically enforces a three-strike policy before terminating a channel.
"As always, context is important, so news coverage on these issues or content by discussing without targeting individuals or protected groups can remain active ", reads the blog post. "We will begin to apply this updated policy today and will do so in the weeks to come."
"There is still more we can do for some conspiracy theories which are used to justify real-world violence, like QAnon "
YouTube has spent the past two years updating its policies to target hateful videos, some of which include conspiracy theory videos, according to the blog post. Those policies are meant to limit algorithmic recommendations of such videos, and the number of views QAnon content receives from non-subscriber recommendations has dropped by more than 80 percent since January 2019, the company says.
While YouTube is the latest company to take an additional stand against QAnon conspiracy theories, other social platforms have also started taking firmer positions. Facebook banned content related to QAnon last week, although posts from individual accounts are still allowed. It was the biggest step Facebook has taken in its ongoing fight against the spread of disinformation on its platform. Pinterest also reiterated its ban on QAnon content, which has been in place since 2018, a spokesperson told The Verge. Peloton has removed hashtags related to the conspiracy theory as well.
The YouTube blog post adds that the company has "removed tens of thousands of QAnon-videos and terminated hundreds of channels" since its updated policies went into effect. The company calls the work a pivot toward "limiting the reach of harmful conspiracies," but acknowledges that more needs to be done.
"We can do even more to combat certain conspiracy theories that are used to justify violence in the real world, like QAnon. "
Updated October 16, 12:15 p.m. ET: The story has been updated to include additional context from Pinterest.