A behind the scenes look at the Wikimedia Foundation’s emergency response system

When my work phone rings in the middle of the night, I never have to wonder what it is. As one of a team of five in charge of addressing emails sent to the Wikimedia Foundation’s emergency address, I sleep lightly on nights when I’m on call. Fortunately, emergencies don’t happen every night, but when they do I need to be prepared for the threat I’ll face – school bombing? Violence against another user? Threat of suicide? Maybe it’ll just be spam or vandalism and I’ll be back in bed in a few minutes, but I have faced all of the preceding and more. On the emergency team, we all have.

Threats of violence or self-harm are sad but, luckily, relatively uncommon events on our projects. The English Wikipedia community has developed a process for handling them, as have a number of other projects. The emergency email address serves to help protect the public and users of Wikipedia, and the community advocacy team responds to these reports as part of our regular duties. We process threats of violence against self and others posted on WMF sites, running them through a protocol developed in consultation with the FBI. When a threat is credible and imminent according to the reporting criteria, we pass it along to federal or local authorities. This has brought us into contact with law enforcement around the world. We don’t always know how these reports resolve, which can leave us feeling very unsettled. Sometimes, though, it’s even more unsettling when we do.

The emergency system was established in 2010 by Philippe Beaudette (now Director, Community Advocacy), who had experience managing and creating processes for “Trust and Safety” issues with other companies and communities. He was joined in managing incoming issues by James Alexander and Christine Moellenberndt. In consultation with other staff and from years of experience, Philippe has worked to create a functional system that is manageable for the global scope of the Wikimedia projects.

What we’re looking for, primarily, is specificity and plausibility. “Block me, and I’ll kill you,” when authored by a vandal to an admin operating anonymously under a pseudonym, is neither. If a user in Philadelphia edits an area school article threatening to kill a teacher, it’s both. If we report on any private data, such as IP addresses behind accounts, we fill out a form that logs such instances. (We report any data that we have that may facilitate the rapid response of officials to these incidents, consistent with our privacy policy.) We follow this up by annotating the outcome to the other members of the team and filing a report, in our case management software, SugarCRM.

Many times during this process, we need to reach out for assistance. Sometimes reports come in from languages where we have no proficiency, and we need to find staff or volunteers to help us translate the threat and understand the context in which it was placed. Sometimes we need local administrators to help address the incidents on the projects, for instance blocking a user or oversighting content as appropriate under local policies.

Sometimes people misunderstand or misuse the system. While we are here to evaluate any threat of harm to self or others, we are sometimes contacted by people who are unhappy with the content of articles or with disputes they are having with other editors. As a matter of strict procedure, we do not assist with off-topic messages sent to this address; we don’t even forward them to other channels. We cannot afford the dilution of the emergency response system, and appropriate avenues for outreach for these kinds of problems are widely publicized on the projects themselves.

Over the years that I have worked for the Wikimedia Foundation, this process has continually improved. We have received training in dealing with emergencies and in decompressing afterwards, and the tools we use to handle them have been refined to make the whole process quick and efficient. But regardless of training, it remains a challenging experience. We seldom get a sense of closure. We are all aware of the possibility that no matter how quickly we respond, we may be too late. Even finding out that we were able to help can be distressing, because there was a need for us to do it at all. Never mind the stress; it is a responsibility we shoulder willingly. It’s worth it for even a chance to help protect people and save lives.

Maggie Dennis, Senior Community Advocate
