
The Digital Services Act could require big changes to digital platforms. Here are 4 things lawmakers need to know to protect people-powered spaces like Wikipedia.


The European Parliament and Council are debating amendments to the draft proposal for the Digital Services Act (DSA), one of several recent regulatory policies developed to hold large tech platforms accountable for illegal content that spreads on their sites. 

Update: The European Parliament’s Internal Market Committee voted on key provisions of the Digital Services Act (DSA) on 13 December 2021. For an overview of the vote and what this means for the future of the DSA, read our post on Medium.

The Wikimedia Foundation, the nonprofit that operates Wikipedia, applauds European policymakers’ efforts to make content moderation more accountable and transparent. However, some of the DSA’s current provisions and proposed amendments also include requirements that could put Wikipedia’s collaborative and not-for-profit model at risk.

Wikipedia’s system of open collaboration has enabled knowledge-sharing on a global scale for more than 20 years. It is one of the most beloved websites in the world, as well as one of the most trusted sources for up-to-date knowledge about COVID-19. All of this is only made possible by laws that protect its volunteer-led model. But now, that people-powered model is getting caught in the crossfire of the DSA proposals.

The current DSA framework is designed to address the operating models of major tech platforms. But a variety of websites, Wikipedia included, don’t work in the same way that for-profit tech platforms do. Applying a one-size-fits-all solution to the complex problem of illegal content online could stifle a diverse, thriving, and noncommercial ecosystem of online communities and platforms.

We are calling on European lawmakers to take a more nuanced approach to internet regulation. There is more to the internet than Big Tech platforms run by multinational corporations. We ask lawmakers to protect and support nonprofit, community-governed, public interest projects like Wikipedia as the DSA proceeds through the European Parliament and Council.

We are ready to work with lawmakers to amend the DSA package so that it empowers and protects the ability of all Europeans to collaborate in the public interest. 

Protect Wikipedia, protect the people’s internet. 

Here are four things policymakers should know before finalizing the DSA legislation: 

  1. The DSA needs to address the algorithmic systems and business models that drive the harms caused by illegal content. 

DSA provisions remain overly focused on removing content through prescriptive removal processes. The reality is that removing all illegal content from the internet as soon as it appears is as daunting as any effort to prevent and eliminate all crime in the physical world. Given that the European Union is committed to protecting human rights online and offline, lawmakers should focus on the primary cause of widespread harm online: the systems that amplify and spread illegal content.

A safer internet is only possible if DSA provisions address the targeted advertising business model that drives the spread of illegal content. As the Facebook whistleblower Frances Haugen emphasized in her recent testimony in Brussels, the algorithms that drive profits for ad placements are also at the root of the problem that the DSA is seeking to address. New regulation should focus on these mechanisms that maximize the reach and impact of illegal content.

But lawmakers should not be overly focused on Facebook and similar platforms. As a nonprofit website, Wikipedia is available for free to everyone, without ads and without tracking reader behavior. Our volunteer-led, collaborative model of content production and governance helps ensure that content on Wikipedia is neutral and reliable. Thousands of editors deliberate, debate, and work together to decide what information gets included and how it is presented. This works very differently from the centralized systems that rely on algorithms both to share information in a way that maximizes engagement and to moderate potentially illegal or harmful content.

In Wikipedia’s 20 years, our global community of volunteers has proven that empowering users to share and debate facts is a powerful means to combat the use of the internet by hoaxers, foreign influence operators, and extremists. It is imperative that new legislation like the DSA fosters space for a variety of web platforms, commercial and noncommercial, to thrive.

“Wikipedia has shown that it is possible to create healthy online environments that are resilient against disinformation and manipulation. Through nuance and context, Wikipedia offers a model that works well to address the intricacies required in content moderation. Yes, there might be disagreement amongst volunteers on how to present a topic, but that discussion yields better, more neutral, and reliable articles. This process is what has enabled it to be one of the most successful content moderation models in this day and age.”

Brit Stakston, media strategist and Board member of Wikimedia Sverige

  2. Terms of service should be transparent and equitable, but regulators should not be overly prescriptive in determining how they are created and enforced.

The draft DSA’s Article 12 currently states that an online provider has to disclose its terms of service—its rules and tools for content moderation—and that they must be enforced “in a diligent, objective, and proportionate manner.” We agree that terms of service should be as transparent and equitable as possible. However, the words “objective” and “proportionate” leave room for vague and open-ended interpretation. We sympathize with the intent, which is to make companies’ content moderation processes less arbitrary and opaque. But forcing platforms to be “objective” about terms of service violations would have unintended consequences. Such language could lead to enforcement that would make it impossible for community-governed platforms like Wikipedia to use volunteer-driven, collaborative processes to create new rules and enforce existing ones that appropriately take the context and origin of content into account.

The policies for content and conduct on Wikipedia are developed and enforced by the people contributing to Wikipedia themselves. This model allows people who know about a topic to determine what content should exist on the site and how that content should be maintained, based on established neutrality and reliable sourcing rules. This model, while imperfect, keeps Wikipedia neutral and reliable. As more people engage in the editorial process of debating, fact-checking, and adding information, Wikipedia articles tend to become more neutral. What’s more, volunteers’ deliberation, decisions, and enforcement actions are publicly documented on the website.  

This approach to content creation and governance is a far cry from the top-down power structure of the commercial platforms that DSA provisions target. The DSA should protect and promote spaces on the web that allow for open collaboration instead of forcing Wikipedia to conform to a top-down model.

  3. The process for identifying and removing “illegal content” must include user communities.

Article 14 states that online platforms will be responsible for removing any illegal content that might be uploaded by users, once the platforms have been notified of that illegal content. It also states that platforms will be responsible for creating mechanisms that make it possible for users to alert platform providers of illegal content. These provisions speak mainly to one type of platform: those with centralized content moderation systems, where users have limited ability to participate in decisions about content, and moderation instead tends to fall to a single body run by the platform. It is unclear how platforms that fall outside this archetype will be affected by the final versions of these provisions.

The Wikipedia model empowers the volunteers who edit Wikipedia to remove content according to a mutually agreed-upon set of shared standards. Thus, while the Wikimedia Foundation handles some requests to evaluate illegal content, the vast majority of content that does not meet Wikipedia’s standards is handled by volunteers before a complaint is even made to the Foundation. One size simply does not fit all in this case.

We fear that by placing legal responsibility for enforcement solely on service providers and requiring them to uphold strict standards for content removal, the law disincentivizes systems that rely on community moderators and deliberative processes, even though these processes have been shown to work well at identifying and quickly removing bad content. The result would be an online world in which service providers, not people, control what information is available online. We are concerned that this provision would do the exact opposite of what the DSA intends: giving more power to platforms, and less to the people who use them.

  4. People cannot be replaced with algorithms when it comes to moderating content.

The best parts of the internet are powered by people, not in spite of them. Articles 12 and 14 would require platform operators to take control of all decisions about content moderation, which would in turn incentivize or even require the use of automated content detection systems. While such systems can support community-led content moderation by flagging content for review, they cannot replace humans. If anything, research has uncovered systemic biases and high error rates that are all too frequently associated with the use of automated tools. Such algorithms can thus further compound the harm posed by amplification. Automated tools are also limited in their ability to identify fringe content that may be extreme but still has public interest value: videos documenting human rights abuses, for example, have been shown to be swiftly removed by such systems. These cases only underscore the need to prioritize human context over speed.

Therefore, European lawmakers should avoid over-reliance on the kind of algorithms used by commercial platforms to moderate content. If the DSA forces or incentivizes platforms to deploy algorithms to make judgments about the value or infringing nature of content, we all, as a digital citizenry, miss out on the opportunity to shape our digital future together.

On Wikipedia, machine learning tools are used as an aid to, not a replacement for, human-led content moderation. These tools operate transparently, and volunteers have the final say over any actions that the tools suggest. As we have seen, putting more decision-making power into the hands of Wikipedia readers and editors makes the site more robust and reliable.

“It is impossible to trust a ‘perfect algorithm’ to moderate content online. There will always be errors, by malicious intent or otherwise. Wikipedia is successful because it does not follow a predefined model; rather, it relies on the discussions and consensus of humans instead of algorithms.”

Maurizio Codogno, longtime Italian Wikipedia volunteer 

We urge policymakers to think about how new rules can help reshape our digital spaces so that collaborative platforms like ours are no longer the exception. Regulation should empower people to take control of their digital public spaces, instead of confining them to act as passive recipients of content moderation practices. We need policy and legal frameworks that enable and empower citizens to shape the internet’s future, rather than forcing platforms to exclude them further.

Our public interest community is here to engage with lawmakers to help design regulations that empower citizens to improve our online spaces together. 

“Humanity’s knowledge is, more often than not, still inaccessible to many: whether it’s stored in private archives, hidden in little-known databases, or lost in the memories of our elders. Wikipedia aims to improve the dissemination of knowledge by digitizing our heritage and sharing it freely for everyone online. The COVID-19 pandemic and subsequent infodemic only further remind us of the importance of spreading free knowledge.”

Pierre-Yves Beaudouin, President, Wikimedia France
