Interaction principles for online collaboration

Painting by Harry Wilson Watrous, public domain/CC0.

Over the past 15 years, Wikimedians have collaboratively built some of the most remarkable projects on the Internet in service of free knowledge. Editors on Wikipedia, contributors to Commons, and administrators on other sites are united in their goal of collecting human knowledge and making it accessible and reusable, for free, for everybody in the world. They work hard toward this goal, contributing an impressive amount of time and effort.

Wikimedians not only want to collect knowledge, they also want to get that knowledge right. They care deeply about the factual quality of the information on Wikimedia projects—about complying with copyright (e.g. for images that illustrate Wikipedia), about freedom of expression, about neutrality and the strength of underlying sources. In order to collaboratively build and improve content, Wikimedians discuss their views on talk pages, on email lists, and in one-on-one conversations. Naturally, when editors, administrators, and other contributors disagree about certain issues, they will argue. Oftentimes, they have passionate debates, fiercely defending their point of view. And at times their disagreements escalate in ways that lead Wikipedians to use harsh words or be abusive to each other. In some cases, however, bad behavior seemingly comes out of nowhere, for instance when users personally attack or troll others or engage in acts of vandalism. These problems have been written about widely and have been researched in our own community. Their existence on Wikipedia is not something we are proud of.

However, this is not a phenomenon that only exists in Wikimedia discussions: many websites that facilitate user contributions or comments see harsh conversations and personal attacks among users. As most of our communication moves online, including important democratic discourse, speech that threatens sincere conversations and debates increasingly becomes a problem. That’s why we are pleased to see that there are different initiatives that seek to address the issue of harassment online. We try to learn from those initiatives and hope they will succeed. At the same time, we know that we cannot rely on the work of others to make sure that the Wikimedia projects are safe for everyone to access and contribute to free knowledge. Rather, we are determined to create a friendly space ourselves where people can gather to collect encyclopedic information and educational content.

There are several connected reasons for us to do this. An unfriendly or even toxic environment can be an impenetrable barrier to accessing knowledge. We cannot expect people to join our movement and contribute to our mission of collecting free knowledge if they don’t feel comfortable on our websites. Yet, in order to build an exhaustive encyclopedia that covers diverse views and perspectives, we need as many people as possible to contribute to Wikipedia and our other projects. In today’s world, people have many options for spending their free time, and negative experiences would seriously threaten the success of our movement’s work. It has also been argued that platforms have an ethical obligation to protect their users from abusive behavior through community management. Finally, we also believe that productivity is diminished by a harsh tone, and especially by polemic and aggression.

While it is clear to us that many of these reasons deserve further research, we also recognize that one challenge to our intention is finding the right balance between promoting free speech and curbing harassment. Wikimedia’s values build on democratic decision-making and collaboration. So we started the process of developing principles for interaction on the projects by asking the community for input. At this year’s Wikimania, the annual gathering of the global Wikimedia movement, together with roughly fifty participants, we discussed Wikimedians’ experiences with existing codes of conduct and policies on Wikipedia, Commons, and other sites. We discussed participants’ expectations for communication on- and off-wiki and collected recommendations for behavior in arguments and disputes over facts and compliance with guidelines on the Wikimedia sites.

Five patterns emerged:

  1. Offer constructive criticism. Offer options.
  2. Treat people as you would like to be treated. No personal attacks. Be empathetic.
  3. Re-read your contributions. Be patient. Think: this is how x makes me feel.
  4. If you see something bad, say something.
  5. Connect on a human level. Apologize. Get off-wiki for a second. Rewind.

We believe these principles for interaction can help us create a friendly space for all contributors and newcomers alike. The Wikimedia Foundation is taking this issue very seriously and working on developing better training for volunteers to discourage abuse and better resolve disputes; you can participate in that project on Meta. We invite you to discuss these principles with your community and to let us know what you think about them in the comments below. This is only the start of a larger conversation that we need to have in order to ensure the continued success of the Wikimedia projects and access to knowledge for everyone.

Patrick Earley, Senior Community Advocate (International)
Jan Gerlach, Public Policy Manager
Wikimedia Foundation
