
What Wikipedia saw during election week in the U.S., and what we’re doing next

Vote sign for US elections

Election Day in the United States was a critical moment for the country, with impacts that will extend well beyond one election cycle. For many Americans, it was an anxiety-inducing event. While voters waited — and waited — for the results to come in, Wikipedia editors across the globe stood ready.

Wikipedia is one of the world’s most trusted resources for knowledge, so it’s essential that it provide users with reliable information. In 2020, a high-stakes election and a deadly pandemic were just two of the many reasons that mission mattered more than ever.

That’s why the Wikimedia Foundation took significant steps to protect Wikipedia from election-related disinformation. For the first time, a disinformation task force worked closely with Wikipedia’s volunteer editors to identify potential information attacks targeting the integrity of the election before they could spread.

Wikipedia’s biggest worry wasn’t vandalism, such as insults or pranks aimed at candidates or biased campaign editing, because those kinds of changes are typically caught and reverted quickly. We were more concerned about activity that could disrupt the election itself: voter suppression tactics targeting information about polling station locations, or other edits that could undermine confidence in the facts.

In the end, Wikipedia dealt with only a small number of events related to election influence activities; neither the Foundation’s task force members nor Wikipedia’s admins saw evidence of large-scale, state-sponsored disinformation.

  • Overall, Wikipedia protected about 2,000 election-related pages. Restrictions were put in place so that many of the most important pages, such as the main article on the 2020 U.S. presidential election, could be edited only by the most trusted and experienced Wikipedia editors.
  • More than 56,000 volunteer editors monitored the protected pages via real-time feeds of new edits to pages they “watch” (a minimal sketch of this kind of feed monitoring follows this list). Those editors were distributed across the globe, so someone was always vigilant, no matter the hour.
  • The Wikimedia Foundation’s disinformation task force recorded and evaluated 18 events, working, as always, closely with the volunteers who lead the process of editing and evaluating. The edits involved were all quickly reverted by Wikipedia’s community.
  • Nearly 800 edits to election-related Wikipedia pages were reverted by the community between November 3 and November 7.
  • The main article on the 2020 U.S. presidential election saw just 33 reversions in the same time frame, a testament to the community’s preparedness and the defenses Wikipedia editors put in place.
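To make the “watch” mechanism above more concrete, here is a minimal sketch of how the same kind of real-time feed can be polled through Wikipedia’s public MediaWiki API. It is illustrative only: volunteers use Wikipedia’s built-in watchlist feature rather than scripts like this, and the two watched titles are hypothetical examples.

```python
# Illustrative sketch, not the tooling volunteers actually use: polls the
# public recent-changes feed and flags edits to a small set of watched pages.
import requests

API_URL = "https://en.wikipedia.org/w/api.php"

# Hypothetical example watchlist; real editors watch thousands of pages.
WATCHED_TITLES = {
    "2020 United States presidential election",
    "Benford's law",
}

def recent_edits(limit=50):
    """Fetch the most recent edits from English Wikipedia's public feed."""
    params = {
        "action": "query",
        "list": "recentchanges",
        "rctype": "edit",
        "rcprop": "title|timestamp|user|comment",
        "rclimit": limit,
        "format": "json",
    }
    response = requests.get(API_URL, params=params, timeout=10)
    response.raise_for_status()
    return response.json()["query"]["recentchanges"]

for change in recent_edits():
    if change["title"] in WATCHED_TITLES:
        print(f'{change["timestamp"]} {change["title"]}: '
              f'edited by {change["user"]} ({change.get("comment", "")})')
```

Community-built anti-vandalism bots and dashboards follow the same feed, typically through streaming interfaces rather than polling like this.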

Wikipedia’s editorial standards played a major role in keeping the platform free of disinformation during the U.S. elections. Editors draw from accurate, verifiable sources, not the latest breaking news or statements on social media. And they collaborate so that information on Wikipedia reflects multiple editors’ areas of expertise.

For instance, the community kept a close eye on the Wikipedia entry for Benford’s Law, a statistical observation about leading digits that was misused to support false allegations of voter fraud. Wikipedia’s community of mathematicians coordinated with editors of political topics to make sure the Benford’s Law article wasn’t turned into a vehicle for disinformation that could undermine confidence in the election results.
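For readers unfamiliar with it, Benford’s Law predicts that in many naturally occurring datasets spanning several orders of magnitude, the leading digit d appears with probability log10(1 + 1/d), so roughly 30% of values start with 1. One reason the fraud allegations were false is that precinct-level vote counts often don’t span enough orders of magnitude for the law to apply. Below is a minimal sketch of the prediction itself; the sample figures are invented purely for illustration.

```python
# A minimal sketch of what Benford's Law predicts. The sample figures are
# invented purely for illustration; they are not real election data.
import math
from collections import Counter

def benford_expected():
    """Expected share of each leading digit 1-9 under Benford's Law."""
    return {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def leading_digit(n):
    """First significant digit of a positive number."""
    return int(str(abs(n)).lstrip("0.")[0])

sample = [311, 1742, 92, 2280, 130, 8455, 1203, 47, 660, 1980]  # made-up figures
counts = Counter(leading_digit(n) for n in sample)

for d, p in benford_expected().items():
    observed = counts.get(d, 0) / len(sample)
    print(f"digit {d}: expected {p:.3f}, observed {observed:.3f}")
```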

This sort of interdisciplinary collaboration is possible because of Wikipedia’s uniquely collective nature. Users see only the latest version of each article, and they can investigate how pages have changed over time. That transparency and consistency make Wikipedia special: there are no personalized timelines or feeds here, and ads and algorithms don’t influence what users see.
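As a concrete illustration of that transparency, anyone can inspect a page’s full edit history through the same public MediaWiki API. The sketch below fetches the ten most recent revisions of the main election article; the title is just an example.

```python
# Illustrative sketch: every Wikipedia page's edit history is public and can
# be inspected through the MediaWiki API.
import requests

API_URL = "https://en.wikipedia.org/w/api.php"

params = {
    "action": "query",
    "prop": "revisions",
    "titles": "2020 United States presidential election",
    "rvprop": "timestamp|user|comment",
    "rvlimit": 10,  # the ten most recent revisions
    "format": "json",
}
data = requests.get(API_URL, params=params, timeout=10).json()

# The API keys results by internal page ID; take the single page returned.
page = next(iter(data["query"]["pages"].values()))
for rev in page["revisions"]:
    print(rev["timestamp"], rev["user"], "-", rev.get("comment", ""))
```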

The U.S. elections may be over, but the work doesn’t end here. In the coming weeks, our task force will conduct a deeper analysis with community editors to learn what worked well and what didn’t, and to inform our practices for similar events in the future.

The solutions are not simple; they’ll require an approach that considers the entire ecosystem of knowledge, from education to journalistic practice to platform response. We’re committed to doing our part to protect the integrity of information on the Wikimedia projects, and to supporting communities everywhere who want to share in the sum of all knowledge.

To help meet this goal, we hope to invest in resources that we can share with international Wikipedia communities to help mitigate future disinformation risks on the sites. We’re also looking to bring together administrators from different language Wikipedias for a global forum on disinformation. Together, we aim to build more tools to support our volunteer editors and to combat disinformation.

As always, convening and supporting the global Wikimedia movement will be at the heart of how we work. Together with editing communities, we’ll develop and refine data-driven tools that support the identification of, and response to, disinformation.

Ryan Merkley (@ryanmerkley) is Chief of Staff at the Wikimedia Foundation.
