A woman participates in a pro-Trump election integrity rally at the Orange County Registrar of Voters offices in Santa Ana, Calif., Monday, Nov. 9, 2020. (Paul Bersebach/The Orange County Register via AP)
Here’s what happens if you post election disinformation on YouTube right now: The video will not be taken down, even if it includes multiple false claims. A small fact-checking label may be applied, and the video probably won’t be promoted in YouTube’s recommendations or search results. But the video will remain on the platform, and it can still be monetized.
That’s a slight paraphrase of the actual policies, but this is essentially how YouTube’s light-touch approach to moderation works in the wake of the most hotly disputed election results in U.S. history.
The result is that election misinformation is rampant on YouTube, with videos spreading completely baseless claims about fake election results racking up millions of views and earning YouTube and the content creators money from ads and merchandising.
YouTube claims that by not promoting the videos in search results or through its recommendation algorithm, it limits the videos’ reach.
But right-wing channels seeking to promote unfounded claims of election fraud and vote-rigging have short-circuited YouTube’s efforts, using other social networks to create a feedback loop that actually helps spread the misinformation even further.
Here’s how that works:
First, a baseless claim is shared on Twitter by an account known for sharing conspiracy theories. In this case, it was the claim that RealClearPolitics had rescinded calling Pennsylvania for Joe Biden (in reality RCP had never called Pennsylvania for Biden in the first place).
Next, a high-profile right-wing figure picks up the claim. In this case, Rudy Giuliani posted on Monday evening that “Real Clear Politics just took PA away from Biden and made it a toss-up.”
Then, three hours later, a right-wing YouTube channel published a video centered around that claim.
Within hours, the video was racking up hundreds of thousands of views, even though YouTube’s algorithm was not recommending it or surfacing it through search results. That’s because the video was being shared widely in Facebook Groups, most of them private, which helped boost the baseless claims.
By Wednesday morning, the video had racked up almost 1.8 million views. The video is so popular that not only is the channel making money from ads, it has also begun selling merch under the video — something YouTube takes a cut from too.
Not only has YouTube not taken the video down or demonetized it, it has even failed to apply the perfunctory label telling viewers that the election has already been called for Joe Biden.
The video was created by Gary S. Franchi, host of the Next News Network and a known conspiracy theorist. Franchi was listed by the Southern Poverty Law Center as someone spreading hate after he helped promote the false claim that the U.S. was building concentration camps for Americans who disagreed with the government. And yet, last year YouTube gave Franchi a gold Creator Award.
But it is not only conspiracy theorists who are spreading this type of misinformation on YouTube.
The president’s own official channel posted a video last week entitled “If you count the legal votes, I easily win.” On Tuesday, right-wing news outlet One America News Network posted a video titled “Trump won.”
While YouTube has labeled that video, it remains on the platform, has already racked up 300,000 views, and continues to be monetized.
These are among dozens of videos that either falsely claim Trump has won the election or that Biden has stolen it.
YouTube didn’t respond to a VICE News request for comment, but in a statement to the New York Times, YouTube defended its position by saying that “the majority of election-related searches are surfacing results from authoritative sources, and we’re reducing the spread of harmful elections-related misinformation.”
The path taken by these particular lies about RealClearPolitics, from Twitter to YouTube, onto Facebook, and back to YouTube again, shows how ineffective YouTube’s policy is.
“Like other companies, we are allowing discussions of the election results and the process of counting votes and are continuing to closely monitor new developments,” a YouTube spokesperson added.
But by allowing misinformation to spread so easily on its platform, YouTube is endangering its users at a time when a quarter of all Americans rely on YouTube as a primary source of news.