As required by the European Union’s Digital Services Act (“DSA”), the Wikimedia Foundation is publishing data on requests received between 1 July and 31 December 2023. On 25 August 2023, the DSA’s main obligations began to apply to Wikipedia, four months after Wikipedia was designated as a Very Large Online Platform.

In line with our consistent practice for transparency reporting, and with the DSA’s requirements, this report includes only information about moderation actions taken by the Foundation, not actions taken by the global community of volunteer contributors who create and curate the information on Wikimedia projects. On a project such as Wikipedia, the community conducts almost all the content moderation. Readers wishing to gain a broader appreciation of moderation activity across the projects are invited to browse the many data interfaces available for this purpose, such as Wikistats, and the Logs function for each project (e.g. the Logs for English Wikipedia).

Regarding countries of origin

The Wikimedia Foundation is committed to protecting the privacy of our users, both readers and editors alike. As such, we deliberately collect very little data about our users, practicing the principles of data minimization. The right to privacy is accordingly embedded into our projects and is at the core of how communities contribute to Wikimedia projects. We protect users’ data and foster free knowledge by allowing users to partake in knowledge sharing activities across our platforms without providing any personal details, including their country of origin. In the vast majority of requests we receive to alter content or hand over user data, we do not know the country in which the requester is located or of which they hold citizenship. This report represents a best effort to report on the requests in which this information was obvious or could be reasonably deduced from the content of the request (such as the requester citing a country-specific law).

Regarding volume of requests

Many of the following sections of this report indicate that no relevant requests were received during the covered period. The Wikimedia Foundation generally receives a low volume of such requests, and future reports will likely continue to show low but non-zero numbers for many DSA-responsive categories.

Orders from EU Member States

During the covered period, the Foundation did not receive any formal orders from Member States’ authorities, either in the form of “Orders to act against illegal content” (DSA Article 9) or “Orders to provide information” (Article 10). Requests from EU-based government bodies or representatives, including informal communications, to provide user data or to alter or take down content are reported in the Transparency Report.

Notice and action submissions

  • Notices: 70
  • Notices from trusted flaggers: 0
  • Actions on the basis of law: 3
  • Actions on the basis of terms & conditions: 58
  • Pending resolution: 9
  • Median time for taking action: 10 days

During the covered period, the Foundation received the notices listed below containing allegations of illegal content on the projects. The Foundation did not receive notices from trusted flaggers during the period covered in this report. Trusted flaggers will be appointed starting in February 2024, and any notices from trusted flaggers will be reflected in our reports moving forward.

For all Wikimedia projects, the community oversees and manages the bulk of content moderation activities. In many cases, the Foundation refers requests we receive to the community and defers to community decisions about project content. When the community asks the Foundation for input, we sometimes provide guidance on relevant laws or project policies. In calculating the median time for taking action on resolved notices, we measure the time from when a notice was received until we responded to it following our internal review and determination (a short sketch of this calculation appears after the table below).

Country | Alleged Claim | Notices
Austria | Data protection and privacy violations | 1
Austria | Illegal or harmful speech | 1
Austria | Content in violation of the platform's terms and conditions | 1
Czech Republic | Data protection and privacy violations | 1
Estonia | Illegal or harmful speech | 1
France | Data protection and privacy violations | 5
France | Illegal or harmful speech | 8
France | Intellectual property infringements | 1
Germany | Data protection and privacy violations | 10
Germany | Illegal or harmful speech | 11
Germany | Intellectual property infringements | 5
Germany | Risk for public security | 1
Germany | Content in violation of the platform's terms and conditions | 1
Hungary | Intellectual property infringements | 1
Hungary | Content in violation of the platform's terms and conditions | 2
Ireland | Data protection and privacy violations | 1
Italy | Data protection and privacy violations | 1
Italy | Illegal or harmful speech | 3
Italy | Intellectual property infringements | 3
Lithuania | Illegal or harmful speech | 1
Netherlands | Data protection and privacy violations | 3
Poland | Content in violation of the platform's terms and conditions | 1
Romania | Data protection and privacy violations | 1
Romania | Illegal or harmful speech | 1
Spain | Data protection and privacy violations | 1
Spain | Illegal or harmful speech | 2
Sweden | Illegal or harmful speech | 1
Sweden | Violence | 1
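
To make the median calculation concrete, here is a minimal Python sketch using invented received/resolved dates for three hypothetical notices; it is illustrative only and does not reflect the Foundation's internal tooling or the actual notice data.

```python
from datetime import datetime
from statistics import median

# Hypothetical (received, resolved) dates for resolved notices -- invented values.
resolved_notices = [
    ("2023-09-01", "2023-09-08"),
    ("2023-10-12", "2023-10-25"),
    ("2023-11-03", "2023-11-13"),
]

def days_to_action(received: str, resolved: str) -> int:
    """Number of days between receipt of a notice and the response to it."""
    fmt = "%Y-%m-%d"
    return (datetime.strptime(resolved, fmt) - datetime.strptime(received, fmt)).days

median_days = median(days_to_action(r, d) for r, d in resolved_notices)
print(f"Median time for taking action: {median_days} days")
```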

Content moderation by the Wikimedia Foundation

Copyright

The Foundation publishes its Digital Millennium Copyright Act (“DMCA”) Policy to inform individuals and entities about the process for submitting a takedown notice to report allegedly infringing material uploaded to the projects. Rightsholders are provided with an easy-to-understand process so they can submit their claims for evaluation.

The Foundation thoroughly evaluates each DMCA takedown request to ensure that it is valid. We only remove allegedly infringing content when we believe that a request is valid, and we are transparent about that removal. If we do not believe a request to be valid, we will push back as appropriate.

During the covered period of this report, the Foundation received 1 DMCA takedown notice from Germany. 

Child safety actions

While child sexual abuse material (CSAM) has been found on Wikimedia projects, it is very rare. During the covered period of this report, the Wikimedia Foundation removed 162 files as actual or suspected CSAM through our standard reporting system.

No other content was removed by the Wikimedia Foundation through other means during the covered reporting period. More information is available under Child Safety Reports, in the “Requests for content alteration and takedown” section.

Out-of-court settlements

Between July and December 2023, 0 disputes were submitted to a DSA dispute settlement body.

Complaints through internal complaint-handling systems

During the relevant period, we received 2 complaints through our internal appeals system. These claims were determined to be out of scope for the appeals process and were rejected, on average, within one business day.

Bans and suspensions

From time to time, the Wikimedia Foundation issues Global Bans, which bar an individual from continuing to contribute to the Wikimedia projects. In the vast majority of cases, we do not know where these individuals are located, or whether they are EU persons.

During the covered period of this report, we issued bans against 12 accounts for the provision of manifestly illegal content, including harassment and child exploitation material. We also issued bans against 4 additional accounts for disinformation. 

Automated content moderation

We are required to publish information about automated content moderation means, including qualitative descriptions and specified purposes of the tools, data on the accuracy of the automated tools, and descriptions of any safeguards applied.

The Foundation seeks to empower users to participate in content moderation processes by providing them access to machine learning tools that they can use to improve or quickly remove content; these tools are used and maintained by community members. While automated tools are used to support existing community moderation processes, the bulk of the work is still done manually.

The tools that editors can use include:

  • ClueBot NG (for English Wikipedia), an automated tool which uses a combination of different machine learning detection methods and requires a high confidence level to automatically remove vandalism on the projects. Other bots similar to ClueBot include SaleBot (French, Portuguese), SeroBot (Spanish), and PatrocleBot (Romanian).
  • Objective Revision Evaluation Service (ORES), which assigns scores to edits and articles in order to help human editors improve articles. ORES is available in multiple languages, including several official languages of EU member states (a query sketch follows this list).
  • Additionally, users with special privileges have access to the AbuseFilter extension, which allows them to set specific controls and create automated reactions to certain behaviors.
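
As an illustration of how a scoring service like ORES can support threshold-based review, the following minimal Python sketch queries the public ORES v3 scoring API for the “damaging” model on a single English Wikipedia revision and flags it only above a high confidence threshold. The revision ID and threshold are invented, the response structure is based on the documented ORES v3 format (which may since have been superseded by Lift Wing), and this is not an implementation of ClueBot NG or any other community bot.

```python
import requests

REV_ID = 123456789   # hypothetical revision ID, for illustration only
THRESHOLD = 0.95     # require high confidence before flagging, as vandalism bots do

# Query the public ORES v3 API for the "damaging" score of one English Wikipedia edit.
url = f"https://ores.wikimedia.org/v3/scores/enwiki/{REV_ID}/damaging"
response = requests.get(url, timeout=10)
response.raise_for_status()
data = response.json()

# Scores are nested by wiki, revision ID, and model name.
score = data["enwiki"]["scores"][str(REV_ID)]["damaging"]["score"]
prob_damaging = score["probability"]["true"]

if prob_damaging >= THRESHOLD:
    print(f"Revision {REV_ID}: likely vandalism (p={prob_damaging:.2f}), queue for review")
else:
    print(f"Revision {REV_ID}: below threshold (p={prob_damaging:.2f}), no action")
```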

Foundation Trust & Safety staff also use select automated tools to scan for child sexual abuse material. PhotoDNA, an automated tool, is used to identify known CSAM images and videos and to report them to the National Center for Missing and Exploited Children (NCMEC), a nonprofit that refers cases to law enforcement agencies around the world.

The Wikimedia community is highly effective at removing illegal and harmful content on the projects. In 2019, researchers at the Berkman Klein Center for Internet and Society at Harvard University found that the median amount of time harmful content (including harassment, threats, or defamation) remained on English-language Wikipedia was 61 seconds.

Human resources

The vast majority of human resources devoted to content moderation come from our communities of independent volunteer editors, not from Wikimedia Foundation staff. Information about current active editors by language of Wikipedia, including official languages of EU member states, can be found on Wikistats.

The Wikimedia Foundation does employ a staff of Trust & Safety experts, who are available to address complex issues requiring resolution by the Foundation. Due to safety concerns, and the small size of this team, we are not able to provide detailed breakdowns of their backgrounds and linguistic expertise. The team collectively provides linguistic capacity in multiple languages used on the Wikimedia projects; for EU purposes, the most relevant languages covered would be English, French, and Polish. In some cases, Trust & Safety staff may liaise with volunteer editors with competence in other languages, and/or use machine translation tools, in order to investigate and address challenges in additional languages.

Average monthly EU recipients

In order to meet the requirements of DSA Article 24(2), the Foundation created a dedicated EU DSA Userbase Statistics page to provide a reasonable estimate of monthly “active”, “unique” human users of our main project families, across the EU, averaged over a 6-month period.
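
As a simple illustration of the averaging described above, this sketch computes a 6-month average from hypothetical monthly unique-user counts; the figures are invented and do not correspond to the numbers published on the statistics page.

```python
# Hypothetical monthly counts of unique EU users over six months (invented figures).
monthly_unique_users = [
    151_000_000, 149_500_000, 152_300_000,
    150_800_000, 148_900_000, 151_600_000,
]

average_monthly_recipients = sum(monthly_unique_users) / len(monthly_unique_users)
print(f"Average monthly EU recipients: {average_monthly_recipients:,.0f}")
```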
