Painting by Harry Wilson Watrous, public domain/CC0.

Over the past 15 years, Wikimedians have collaboratively built some of the most amazing projects on the Internet and for free knowledge. Editors on Wikipedia, contributors to Commons, and administrators on other sites are united in their goals of collecting human knowledge and making it accessible and reusable for free and for everybody in the world. They work hard towards this goal, contributing an impressive amount of time and effort.

Wikimedians not only want to collect knowledge, they also want to get that knowledge right. They care a lot about the factual quality of the information on Wikimedia projects—about complying with copyright (e.g. for images that illustrate Wikipedia), about freedom of expression, about neutrality and the strength of underlying sources. In order to collaboratively build and improve content, Wikimedians discuss their views on talk pages, on email lists, and in 1-to-1 conversations. Naturally, when editors, administrators, and other contributors disagree about certain issues, they will argue. Oftentimes, they have passionate debates, fiercely defending their point of view. And at times their disagreements escalate in ways that lead Wikipedians to use harsh words or be abusive to each other. In some cases, however, bad behavior seemingly comes out of nowhere, for instance when users personally attack or troll others or engage in acts of vandalism. These problems have been written about widely across the Internet and have been researched in our community. Their existence on Wikipedia is something that we are not proud of.

However, this is not a phenomenon that only exists in Wikimedia discussions: many websites that facilitate user contributions or comments see harsh conversations and personal attacks among users. As most of our communication moves online, including important democratic discourse, speech that threatens sincere conversations and debates increasingly becomes a problem. That’s why we are pleased to see that there are different initiatives that seek to address the issue of harassment online. We try to learn from those initiatives and hope they will succeed. At the same time, we know that we cannot rely on the work of others to make sure that the Wikimedia projects are safe for everyone to access and contribute to free knowledge. Rather, we are determined to create a friendly space ourselves where people can gather to collect encyclopedic information and educational content.

There are several connected reasons for us to do this. An unfriendly or even toxic environment can be an impenetrable barrier to accessing knowledge. We cannot expect people to join our movement and contribute to our mission of collecting free knowledge if they don’t feel comfortable on our websites. Yet, in order to build an exhaustive encyclopedia that covers diverse views and perspectives, we need as many people as possible to contribute to Wikipedia and our other Projects. In today’s world, people have many options for spending their free time, and negative experiences can seriously threaten the success of our movement’s work. It has also been argued that platforms have an ethical obligation to protect their users from abusive behavior through community management. Finally, we also believe that productivity is diminished by a harsh tone, and especially by polemic and aggression.

While it is clear to us that many of these reasons deserve further research, we also recognize that one challenge to this effort is finding the right balance between promoting free speech and curbing harassment. Wikimedia’s values build on democratic decision-making and collaboration. So we started the process of developing principles for interaction on the Projects by asking the community for input. At this year’s Wikimania, the annual gathering of the global Wikimedia movement, together with roughly fifty participants, we discussed Wikimedians’ experiences with existing codes of conduct and policies on Wikipedia, Commons, etc. We discussed participants’ expectations for communication on- and off-Wiki, and collected recommendations for behavior in arguments and in disputes over facts and compliance with guidelines on the Wikimedia sites.

Five patterns have emerged:

  1. Offer constructive criticism. Offer options.
  2. Treat people as you would like to be treated. No personal attacks. Be empathetic.
  3. Re-read your contributions. Be patient. Think: this is how x makes me feel.
  4. If you see something bad, say something.
  5. Connect on a human level. Apologize. Get off-Wiki for a second. Rewind.

We believe these principles for interaction can help us create a friendly space for all contributors and newcomers alike. The Wikimedia Foundation is taking this issue very seriously and working on developing better training for volunteers to discourage abuse and better resolve disputes; you can participate in that project on Meta. We invite you to discuss these principles with your community and to let us know what you think about them in the comments below. This is only the start of a larger conversation that we need to have in order to ensure the continued success of the Wikimedia projects and access to knowledge for everyone.

Patrick Earley, Senior Community Advocate (International)
Jan Gerlach, Public Policy Manager
Wikimedia Foundation
