Thursday, October 22, 2020

YouTube tightens rules on conspiracy videos, but stops short of banning QAnon

YouTube CEO Susan Wojcicki speaks during the opening keynote address at the Google I/O 2017 Conference at Shoreline Amphitheater on May 17, 2017 in Mountain View, California.

Justin Sullivan | Getty Images

Google's YouTube is updating its hate speech policy to ban videos that target individuals or groups with conspiracy theories that have been used to incite violence, such as QAnon.

“Today we’re further expanding both our hate and harassment policies to prohibit content that targets an individual or group with conspiracy theories that have been used to justify real-world violence,” a spokesperson said in a statement to CNBC. “One example would be content that threatens or harasses someone by suggesting they are complicit in one of these harmful conspiracies, such as QAnon or Pizzagate.”

For example, YouTube would take down videos falsely accusing Democratic presidential candidate Joe Biden of being involved in pedophilia, a baseless claim that’s related to the QAnon theory and that’s been promoted by prominent Republican leaders, including Donald Trump Jr.

The change stops short of a total ban on QAnon, which echoes YouTube CEO Susan Wojcicki's stance in a recent CNN interview. YouTube told CNBC that it doesn't consider QAnon a monolithic entity that can be banned outright, but a sprawling movement whose content is sometimes interwoven with truth, falls into gray areas or circulates under pseudonyms.

Called a potential source of domestic terrorism by the FBI, QAnon is a baseless conspiracy theory that claims President Donald Trump is engaged in a secret battle to stop a global pedophile ring involving many prominent celebrities and Democrats. Recent QAnon posts have also spread false information about voting and about Covid-19, even sparking claims that the president faked his diagnosis of Covid-19 in order to orchestrate secret arrests. 

The policy update comes as social media companies face pressure to contain misinformation, especially claims amplified by the rise of QAnon ahead of the U.S. elections and during the Covid-19 pandemic. Facebook earlier this week said it would ban all QAnon groups as dangerous, and Twitter has also cracked down on QAnon-related content.

YouTube added that it has removed tens of thousands of QAnon videos and terminated hundreds of channels under existing policies, including those that threaten violence.

Earlier this year, the company updated its “harmful and dangerous” policy to begin removing content that contains Covid-19 misinformation, such as claims that 5G causes the coronavirus or that masks “activate” the virus.

