YouTube will now allow 2020 election denialism content, in policy reversal

CNN/Stylemagazine.com Newswire | 6/2/2023, 2:20 p.m.
Photo caption: On June 2, YouTube said it will no longer remove content that features false claims that the 2020 US presidential election was stolen, reversing a policy instituted more than two years ago amid a wave of false claims about the election. Mandatory Credit: Adobe Stock

Originally Published: 02 JUN 23 14:27 ET

Updated: 02 JUN 23 15:03 ET

By Clare Duffy, CNN

New York (CNN) — YouTube on Friday said it will no longer remove content featuring false claims that the 2020 US presidential election was stolen, reversing a policy instituted more than two years ago amid a wave of misinformation about the election.

The platform said in a blog post that it will stop removing “content that advances false claims that widespread fraud, errors, or glitches occurred in the 2020 and other past US Presidential elections.”

“In the current environment, we find that while removing this content does curb some misinformation, it could also have the unintended effect of curtailing political speech without meaningfully reducing the risk of violence or other real-world harm,” YouTube said in the post, which was first reported by Axios. The Google-owned platform said that it had removed “tens of thousands” of videos under the old policy.

The announcement comes in the run-up to the 2024 US presidential election and at a time when former President Donald Trump, the current frontrunner in the Republican primary, continues to make baseless claims about the previous election.

Other elements of the platform’s election misinformation policies remain in place, YouTube said, including prohibitions against content that could mislead users about how and when to vote, false claims that could discourage voting and content that “encourages others to interfere with democratic processes.”

YouTube announced in December 2020 that it would remove misleading videos claiming that widespread fraud or other errors changed the outcome of the US presidential election, following the election’s safe harbor deadline.

Pressure on tech companies to combat election misinformation ramped up following the January 6, 2021, attack on the US Capitol, which was fueled by baseless claims about election fraud.

Ahead of the 2022 midterm elections, YouTube said it had begun removing midterm-related videos that made false claims about the 2020 election in violation of its policies.

“This includes videos that violated our election integrity policy by claiming widespread fraud, errors, or glitches occurred in the 2020 U.S. presidential election, or alleging the election was stolen or rigged,” YouTube said at the time. The policy went further than those instituted by competitors Twitter and Meta, which both stopped short of banning content questioning the outcome of the election.

YouTube said on Friday that it will continue to elevate election information from authoritative sources such as news outlets on the platform, and that it will share more about its approach to the 2024 election in the coming months.