As required by the European Union’s Digital Services Act (“DSA”), the Wikimedia Foundation is publishing data on requests received between 1 January and 30 June 2024. In August 2023, the DSA’s main obligations began to apply to Wikipedia, four months after Wikipedia was designated as a Very Large Online Platform.
Consistent with our established transparency reporting practice, and with the DSA’s requirements, this report only includes information about moderation actions taken by the Foundation, not actions taken by the global community of volunteer contributors who create and curate the information on Wikimedia projects. On a project such as Wikipedia, the community conducts almost all of the content moderation. Readers seeking a broader view of moderation activity across the projects are invited to browse the many data interfaces available for this purpose, such as Wikistats, and the Logs function for each project (e.g. the Logs for English Wikipedia).
Regarding countries of origin
The Wikimedia Foundation is committed to protecting the privacy of our users, readers and editors alike. As such, we deliberately collect very little data about our users, practicing the principles of data minimization. The right to privacy is accordingly embedded into our projects and is at the core of how communities contribute to Wikimedia projects. We protect users’ data and foster free knowledge by allowing users to partake in knowledge sharing activities across our platforms without providing any personal details, including their country of origin. In the vast majority of requests we receive to alter content or hand over user data, we do not know the country in which the requester is located or of which they hold citizenship. This report represents a best effort to report on the requests in which this information was obvious or could be reasonably deduced from the content of the request (such as the requester citing a country-specific law).
Regarding volume of requests
Many of the following sections of this report indicate no relevant requests during the covered period. The Wikimedia Foundation consistently receives a low volume of such requests; future reports will likely show low but non-zero numbers for many DSA-responsive categories.
Orders from EU Member States
During the covered period, the Foundation did not receive any formal orders from Member States’ authorities, either in the form of “Orders to act against illegal content” (DSA Article 9) or “Orders to provide information” (Article 10). The Foundation did, however, receive 4 non-DSA-specific orders issued by Member States’ authorities relating to the projects, all of which remain pending and have not been resolved with sufficient finality for the Foundation to take action giving effect to them.
| Country | Category | Number |
|---|---|---|
| Italy | Visibility restriction | 1 |
| France | Right to be forgotten | 1 |
| Greece | Provide information in relation to Privacy claim | 1 |
| France | Provide information in relation to Defamation claim | 1 |
Requests from EU-based government bodies or their representatives to provide user data or to alter and take down content, including informal communications, are reported in the Transparency Report.
Notice and action submissions
During the covered period, the Foundation received the following notices containing allegations of illegal content on the projects. Notably, the Foundation did not receive notices from trusted flaggers.
For all Wikimedia projects, the community oversees and manages the bulk of content moderation activities. In many cases, the Foundation refers requests we receive to the community and defers to community decisions about project content. When the community asks the Foundation for input, we sometimes provide guidance on relevant laws or project policies. Because the Wikimedia projects are volunteer-led, these notices are generally addressed by the community independently applying its own policies; only a very few notices are resolved through the Foundation’s intervention. In calculating the median handling time, we measure from when a notice was received through an official mechanism to when we responded to the sender, explaining how the projects work and any possible next steps available.
| Country | Alleged Claim | Notices |
|---|---|---|
| Austria | Right to be forgotten | 1 |
| Bulgaria | Non-consensual image sharing | 1 |
| Croatia | Right to be forgotten | 1 |
| Czech Republic | Missing processing ground | 1 |
| Estonia | Other (Data privacy and protection) | 1 |
| Finland | Copyright infringement | 1 |
| France | Missing processing ground | 1 |
| France | Right to be forgotten | 2 |
| France | Other (Data privacy and protection) | 1 |
| France | Defamation | 3 |
| France | Discrimination | 2 |
| France | Other (Illegal or harmful speech) | 1 |
| France | Copyright infringement | 2 |
| France | Trademark infringement | 1 |
| France | Online bullying | 1 |
| Germany | Missing processing ground | 4 |
| Germany | Right to be forgotten | 1 |
| Germany | Discrimination | 2 |
| Germany | Copyright infringement | 4 |
| Germany | Trademark infringement | 1 |
| Germany | Misinformation | 4 |
| Germany | Violation of national law | 1 |
| Greece | Right to be forgotten | 1 |
| Italy | Right to be forgotten | 4 |
| Italy | Defamation | 1 |
| Netherlands | Right to be forgotten | 1 |
| Netherlands | Defamation | 1 |
| Poland | Other (Data privacy and protection) | 1 |
| Poland | Defamation | 1 |
| Portugal | Trademark infringement | 2 |
| Spain | Other (Data privacy and protection) | 1 |
| Spain | Defamation | 1 |
| Spain | Trademark infringement | 1 |
| Spain | Misinformation | 1 |
| Spain | Non-consensual image sharing | 1 |
| Sweden | Other (Data privacy and protection) | 2 |
Content moderation by the Wikimedia Foundation
Copyright
The Foundation publishes its Digital Millennium Copyright Act (“DMCA”) Policy to inform individuals and entities about the process by which a takedown notice can be submitted to report allegedly infringing material uploaded to the projects. Rightsholders are provided with an easy-to-understand process for submitting their claims for evaluation.
The Foundation thoroughly evaluates each DMCA takedown request to ensure that it is valid. We only remove allegedly infringing content when we believe that a request is valid, and we are transparent about that removal. If we do not believe a request to be valid, we will push back as appropriate.
During the covered period of this report, the Foundation received 0 DMCA takedown notices from the EU.
Child safety actions
While child sexual abuse material (CSAM) has been found on Wikimedia projects, it is very rare. During the covered period of this report, the Wikimedia Foundation removed 145 files as actual or suspected CSAM through our standard reporting system.
No other content was removed by the Wikimedia Foundation through any other means during the covered reporting period. More information is available under Child Safety Reports in the “Requests for content alteration and takedown” section.
Out-of-court settlements
During the covered period, 0 disputes were submitted to a DSA dispute settlement body.
Complaints through internal complaint-handling systems
During the relevant period, we received 6 complaints through our internal appeals system. These claims were determined to be out of scope for the appeals process and were rejected, on average, within one business day.
Bans and suspensions
From time to time, the Wikimedia Foundation issues Global Bans, which bar an individual from continuing to contribute to the Wikimedia projects. In the vast majority of cases, we do not know where these individuals are located, or whether the users in question are EU persons.
During the covered period of this report, we issued bans against 23 accounts for the provision of manifestly illegal content, including harassment and child exploitation material.
Automated content moderation
We are required to publish information about our use of automated means for content moderation, including a qualitative description and the specified purposes of the tools, data on their accuracy, and a description of any safeguards applied.
The Foundation seeks to empower users to participate in content moderation processes by providing them access to machine learning tools that they can use to improve or quickly remove content; these tools are used and maintained by community members. While automated tools are used to support existing community moderation processes, the bulk of the work is still done manually.
The tools that editors can use include:
- ClueBot NG (for English Wikipedia), an automated tool which uses a combination of different machine learning detection methods and requires a high confidence level to automatically remove vandalism on the projects. Other bots similar to ClueBot include SaleBot (French, Portuguese), SeroBot (Spanish), and PatrocleBot (Romanian).
- Objective Revision Evaluation Service (ORES), which assigns scores to edits and articles in order to help human editors improve articles. ORES is available in multiple languages, including several official languages of EU member states. (A hedged sketch of how such a score might be queried and thresholded follows this list.)
- Additionally, users with special privileges have access to the AbuseFilter extension, which allows them to set specific controls and create automated reactions to certain behaviors.
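To make the threshold-based approach concrete, below is a minimal, hedged Python sketch that queries an ORES edit-quality model for a single revision and flags the edit only when the model’s confidence is high, similar in spirit to the high-confidence requirement described for ClueBot NG above. The endpoint URL, model name, response layout, and the 0.95 cutoff are assumptions for illustration, not the configuration of any production bot.

```python
# Hedged illustration only: query an ORES edit-quality model for one revision
# and apply a high-confidence threshold before flagging it for review.
# The URL, model name ("damaging"), response layout, and threshold are
# assumptions for this sketch, not the settings of any production bot.
import requests

ORES_URL = "https://ores.wikimedia.org/v3/scores/{wiki}/?models={model}&revids={revid}"
THRESHOLD = 0.95  # hypothetical high-confidence cutoff


def looks_like_vandalism(wiki: str, revid: int, model: str = "damaging") -> bool:
    """Return True if the model's estimated probability of damage exceeds THRESHOLD."""
    resp = requests.get(ORES_URL.format(wiki=wiki, model=model, revid=revid), timeout=10)
    resp.raise_for_status()
    score = resp.json()[wiki]["scores"][str(revid)][model]["score"]
    return score["probability"]["true"] >= THRESHOLD


if __name__ == "__main__":
    # Hypothetical English Wikipedia revision ID used purely as an example.
    print(looks_like_vandalism("enwiki", 1234567890))
```

Bots such as ClueBot NG combine several detection methods and community-maintained safeguards before reverting, whereas this sketch checks only a single score.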
Foundation Trust & Safety staff also use select automated tools to scan for child sexual abuse material. PhotoDNA, an automated tool, is used to identify known CSAM images and videos and to report them to the National Center for Missing and Exploited Children (NCMEC), a nonprofit that refers cases to law enforcement agencies around the world.
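PhotoDNA itself is a proprietary perceptual-hashing service, so the sketch below is only a generic illustration of the underlying hash-list pattern under stated assumptions: compute a hash for each uploaded file, compare it against a vetted list of known hashes, and surface any match. The use of SHA-256, the directory layout, and the placeholder hash are hypothetical; PhotoDNA uses robust perceptual hashes rather than exact cryptographic digests.

```python
# Generic hash-list matching sketch (not PhotoDNA, which is proprietary and uses
# perceptual hashing). Paths and the placeholder hash below are hypothetical.
import hashlib
from pathlib import Path

# In practice this set would be populated from a vetted hash list distributed by
# a child-safety organization; the single entry here is a placeholder value.
KNOWN_HASHES: set[str] = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}


def sha256_of(path: Path) -> str:
    """Compute the SHA-256 hex digest of a file, reading it in 1 MiB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def files_to_escalate(upload_dir: Path) -> list[Path]:
    """Return uploaded files whose hashes match the known-hash list."""
    return [p for p in sorted(upload_dir.iterdir())
            if p.is_file() and sha256_of(p) in KNOWN_HASHES]
```

The sketch only identifies candidate matches; reporting to NCMEC, as described above, would follow as a separate step.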
The Wikimedia community is highly effective at removing illegal and harmful content on the projects. In 2019, researchers at the Berkman Klein Center for Internet and Society at Harvard University found that the median amount of time harmful content (including harassment, threats, or defamation) remained on English language Wikipedia was 61 seconds.
Human resources
The vast majority of human resources devoted to content moderation come from our communities of independent volunteer editors, not from Wikimedia Foundation staff. Information about current active editors by language of Wikipedia, including official languages of EU member states, can be found on Wikistats.
The Wikimedia Foundation does employ a staff of Trust & Safety experts, who are available to address complex issues requiring resolution by the Foundation. Due to safety concerns, and the small size of this team, we are not able to provide detailed breakdowns of their backgrounds and linguistic expertise. The team collectively provides linguistic capacity in multiple languages used on the Wikimedia projects; for EU purposes, the most relevant languages covered would be English, French, and Polish. In some cases, Trust & Safety staff may liaise with volunteer editors with competence in other languages, and/or use machine translation tools, in order to investigate and address challenges in additional languages.
Average monthly EU recipients
In order to meet the requirements of DSA Article 24(2), the Foundation created a dedicated EU DSA Userbase Statistics page to provide a reasonable estimate of monthly “active”, “unique” human users of our main project families, across the EU, averaged over a 6-month period.
Other DSA information published by the Wikimedia Foundation relevant to this period
This section was added in November 2024; the rest of this Transparency Report is not affected.
During the first year of the DSA applying to Wikipedia, August 2023 through August 2024, the Wikimedia Foundation underwent its inaugural independent audit under the DSA, covering our compliance on matters such as complaint handling, transparency reporting, and risk assessment and mitigation. These audits are required annually going forward. Public versions of the annual Audit Report, along with the Audit Implementation Report and the latest DSA Systemic Risk Assessment and Mitigation (SRAM) documentation, can be found here.