As Twitter and Facebook scrambled to institute new policies for the 2020 election, YouTube was… mostly quiet. The platform made no flashy announcements about a crackdown on election-related misinformation, nor did it fully grapple with its massive role in distributing information during an extremely volatile stretch for American democracy.
Networks called the presidential election for former Vice President Joe Biden on November 7, but YouTube decided to wait until the federal “safe harbor” deadline, by which states must wrap up audits and recounts and certify their results, to enforce a set of rules against election misinformation.
In a new blog post out Wednesday, the world’s second-biggest social network explained itself — sort of:
Yesterday was the safe harbor deadline for the U.S. Presidential election and enough states have certified their election results to determine a President-elect. Given that, we will start removing any piece of content uploaded today (or anytime after) that misleads people by alleging that widespread fraud or errors changed the outcome of the 2020 U.S. Presidential election, in line with our approach towards historical U.S. Presidential elections. For example, we will remove videos claiming that a Presidential candidate won the election due to widespread software glitches or counting errors. We will begin enforcing this policy today, and will ramp up in the weeks to come.
YouTube clarified that while its users had been allowed to spread misinformation about an undecided election, content claiming that “widespread fraud or errors” changed the result of a past election would not be. And from YouTube’s perspective, one that accommodated the Trump administration’s many empty challenges to the results, the election was only decided yesterday.
The four days between November 3 and November 7 were fraught, plagued by false victory claims from President Trump and his supporters and by fears of political violence as online misinformation, already a pervasive threat, kicked into overdrive. Rather than wading into all that as Twitter and even the ever-reluctant-to-act Facebook did, YouTube mostly opted to sit back and let history take its course. The company was more comfortable pointing all of its users toward authoritative information than making tough calls and actively purging false claims from its platform.
YouTube doesn’t go to great lengths to explain itself these days, much less make real-time platform policy decisions in a transparent way. Twitter has pioneered that approach, and while its choices aren’t always clear or decisive, its transparency and open communication are admirable. Where Twitter doesn’t always get it right, YouTube fails even to step up to the plate, making few real efforts to adapt to the rapidly mutating threats posed by misinformation online.
YouTube’s opaque decision-making process is compounded by the equally opaque nature of online video, which is vastly more difficult for journalists to search and index than text-based platforms. The result is that YouTube has largely escaped scrutiny relative to its stature in the social media world. It’s bizarre to see Mark Zuckerberg and Jack Dorsey called before the Senate Judiciary Committee without even a passing thought given to bringing in YouTube CEO Susan Wojcicki as well. In spite of its massive influence and two billion users, the social video behemoth is barely on lawmakers’ radar.
If YouTube’s strategy is to attract less attention by communicating less, it unfortunately appears to be working. The company is bound to be anxious about getting dragged into federal and state-level antitrust investigations, particularly as state lawsuits seek to pry Facebook and Instagram apart, a playbook that could just as easily be turned on Google’s ownership of YouTube.
The Justice Department is already targeting Google with a historic antitrust suit focused on its search business, but that doesn’t preclude other antitrust actions from taking aim at YouTube. Keeping its head down may have worked for YouTube during four years of Trump, but President-elect Biden is more interested in inoculating people against misinformation than in super-spreading it.