How many times did you look up something on your phone today? Did you ask ChatGPT a question? How about Alexa or Siri or a social media site?
Receiving immediate answers is one of the great ways this technology has improved our lives. But it has also made it harder to sort through a flood of information and be sure we are getting the most accurate and reliable answers. The overwhelming speed of change in today’s online information ecosystem makes it more urgent than ever to have a place for trustworthy, verified facts.
Wikipedia was created more than 20 years ago with that goal in mind. Edited by nearly 300,000 volunteers globally, it now receives more than 15 billion visits each month. Wikipedia sees the same (if not higher) levels of global traffic as well-known, for-profit internet companies at a fraction of the budget and staffing. It’s the only top ten most visited website hosted by a nonprofit organization, the Wikimedia Foundation.
Since becoming CEO of the Wikimedia Foundation last year, I’ve asked hundreds of people all over the world how they think Wikipedia works. This usually leads to a conversation where someone says:
“I sometimes see that message asking for donations, but I hadn’t thought about the fact that there are no ads until now.”
“I had no idea Wikipedia was supported by a non-profit.”
“I use Wikipedia every day. I can’t imagine a world in which it doesn’t exist.”
They usually leave the conversation understanding why the Wikimedia Foundation’s work is vitally important for ‘the encyclopedia that anyone can edit’ to remain freely available to people everywhere.
The Wikimedia Foundation does four critical things to make sure Wikipedia can get closer to its vision of representing the sum of all knowledge. (1) We provide a highly sophisticated technology backbone that keeps Wikipedia secure, fast, and accessible all over the world; (2) we innovate in the latest technologies to deliver accurate, up-to-date Wikipedia content to you, even when you are using other sites online; (3) we help fight misinformation, disinformation, censorship, and other threats; and most importantly, (4) we support volunteers in all regions of the world to build thriving communities of editors and contributors. These people brought Wikipedia to the world more than 20 years ago with a radical belief that humans remain at the core of realizing technology’s promise.
What does all of this take?
#1: A sophisticated technology backbone to keep Wikipedia secure, fast, and accessible
It may surprise you to learn that Wikipedia is regularly recognized as one of the fastest sites in the United States. Fast and reliable access to Wikipedia’s website should not have to depend on where you live. The Wikimedia Foundation continues to grow this technology backbone to deliver a similar experience to users across the Middle East, Africa, South America, Asia, and Europe.
This essential infrastructure has expanded over the years to handle extreme spikes in global traffic. These spikes can happen when there is a significant newsworthy event, such as when a famous person dies. In these moments, we see countless other sites begin to simultaneously pull up-to-the-second information from Wikipedia because it is the source they trust. This in turn creates increased pressure on our technology backbone to keep the site up and running when people need it most. Our engineers pride themselves on making sure Wikipedia doesn’t go down.
We manage to do this with two data centers, four caching centers, and over 30 internet peering and transit connections, all supported by about two thousand servers (we run our own servers for lots of reasons, but especially to protect user privacy). This supports the website and also other digital properties like mobile apps.
But the real investment is in supporting our hundreds of engineers. They write complex code in the open, make hard trade-offs to balance spikes in incoming traffic, and add new databases when needed. They handle the often invisible but critical maintenance of software, from reducing memory consumption to fixing bugs to removing code that threatens the security and safety of our systems.
The Wikimedia Foundation must continue to invest in the security, speed, and reliability of Wikipedia. This lean and highly sophisticated backbone is operated by mission-driven technologists who are utterly dedicated to making sure Wikipedia is always up and running for the billions of visitors who have come to depend on it always being a click away.
#2: Making Wikipedia content available anywhere on the internet
Most people I meet don’t know that the content they use all over the internet comes from Wikipedia, even if they never visit our website. Where does Google get the link to answer your query? Have you ever asked Siri or Alexa where they found the answer? Do you know that ChatGPT and similar tools are all trained on Wikipedia’s data?
This phenomenon was captured well in a recent New York Times Magazine story that described Wikipedia as “a kind of factual netting that holds the whole digital world together.” Search engines depend heavily on Wikipedia’s up-to-date articles; video sites point users to Wikipedia to learn more information; and AI chatbots regularly pull from Wikipedia in generating their responses. How does Wikipedia keep up, while staying true to our purpose and values?
It’s not easy, and this drives a lot of the growing investments we are making now at the Wikimedia Foundation. We are doubling down on protecting user data and privacy, bucking many industry trends. We are doubling down on keeping our content available at no cost to everyone, everywhere, under what is known as a free license. And most importantly, we are doubling down on a belief that high-quality, human-generated content is going to be irreplaceable for generative AI tools like ChatGPT.
We’ve been reflecting a lot on this last topic. As longtime Wikipedia watcher and Slate reporter Stephen Harrison put it: “the implementation of A.I. technology will undoubtedly alter how Wikipedia is used and transform the user experience. At the same time, the features and bugs of large language models, or LLMs, like ChatGPT intersect with human interests in ways that support Wikipedia rather than threaten it.” At the Wikimedia Foundation, this has meant continuously investing in AI and machine learning, while always making sure that humans remain a central part of the equation.
Earlier this year, the Wikimedia Foundation introduced an experimental plugin that tells ChatGPT to search for and summarize the most up-to-date and fact-checked information on Wikipedia when it answers information queries. The plugin also attributes answers to Wikipedia with links for further reading and learning. This innovation builds on AI-based technology that has been part of the Wikimedia Foundation’s work for many years.
Another area for increased investment is in translation tools that the Foundation has created to help volunteer editors translate articles across languages. As the most multilingual digital enterprise in the world, Wikipedia and its sister projects support content creation in more than 300 languages.
Meeting Wikimedia’s global mission requires ongoing creativity and innovation in translation across languages and cultural contexts. This started years ago with a content translation tool that is regularly maintained and improved; it has been used to translate more than 1.2 million of the nearly 62 million Wikipedia articles so far.
This year, we added resources to expand this work into a translation service called MinT (“Machine in Translation”), designed to support underserved languages that are using machine translation for the first time. MinT brings bi-directional translation between 155 languages to Wikipedia using an open source translation model, including 44 languages for which MinT is the first and only translation tool available anywhere in the world. For example, the latest update added support for the Cherokee, Tongan, Hungarian, Kazakh, Kyrgyz, Minangkabau, and Sardinian Wikipedias, with Cherokee, Tongan, and Sardinian served by any open source tool for the very first time. This has greatly simplified the process for editors who translate content to and from these languages. Another recent update added machine translation in Fula for the first time, a language spoken by around 35 million people in West and Central Africa.
Alongside all of this, we have year-in-year-out costs that are required to keep Wikipedia’s ‘factual netting’ healthy and strong. Recently, this has meant making user-guided improvements to the usability of our website for readers. And prioritizing the needs of volunteer editors and technical contributors globally — ranging from customized software and personalized tools to specific bug fixes and sometimes individualized patches across 300+ languages and in all regions of the world!
This is why we readily spend most of our roughly $175 million budget on growing teams of world-class engineers, designers, product managers, researchers, and analysts who are up to this monumental task: building a world in which every human being can share in the sum of all knowledge.
#3: Fighting mis/disinformation, censorship, and other threats
Most of us have seen or experienced first-hand the negative consequences of misinformation, polarization, and censorship online. These harmful realities, along with threats to our personal data and privacy, often leave us to fend for ourselves. For me, that’s why Wikipedia’s goal to provide evidence-based, unbiased, and free information for everyone has never been more urgent.
Wikipedia’s volunteers are the world’s first line of defense. I recently told government leaders that the day-to-day process of building and improving Wikipedia requires these contributors to collaborate, debate, and discuss their edits in order to write thoughtful, informative articles. They hold themselves to high standards of reliability, verifiability, and neutrality by providing citations and sources. On the “Talk” page of every Wikipedia article, they weigh multiple perspectives in the open so that they can make good faith decisions about content together. And they set and enforce rules for what does and doesn’t belong on the Wikimedia projects, guided by a Universal Code of Conduct and supported by the Wikimedia Foundation’s commitment to human rights standards.
This requires expanding our legal, policy, and advocacy strategies to push back against a trend of increasing authoritarianism and government censorship (including blocks of Wikipedia itself, which we helped overturn in Turkey); promoting responsible regulations to support open access to knowledge in legislation like the Digital Services Act; and when necessary, defending volunteers in countries where contributing to Wikipedia remains an act of bravery.
In the next few years, the world will experience consequential elections alongside increased volatility and social unrest driven by climate change and conflict. We see that it is getting harder to ensure that technology serves people, not the other way around.
I believe that this work of the Wikimedia Foundation — promoting the values of open and equitable access to knowledge to people and societies everywhere — must be supported now more than ever before.
#4: Supporting volunteers to build thriving communities of contributors
The Wikimedia Foundation is part of an extensive ecosystem of communities that also includes local chapters representing countries, user groups of volunteers with common interests, allied partners who advocate for open knowledge, and individuals editing Wikipedia who often have no idea that any of this even exists behind the online platform.
One of the most important tasks of the Wikimedia Foundation is to share the financial support we receive with these individuals, groups, and organizations around the world to collectively build thriving communities of contributors. This requires operating a very complex administrative and financial infrastructure that can fund 90+ countries – one that is annually given the highest possible ratings from independent watchdogs like Charity Navigator.
With the guidance of volunteer committees, we balance funding priorities between deeper innovation in more established regions and high-scale growth efforts in newer communities in Asia, Latin America, and Sub-Saharan Africa. In addition, closing what we call ‘knowledge gaps’ is a strategic goal of our movement; just one example of this is the collective efforts of countless individuals and organizations to increase the representation of women’s biographies on Wikipedia.
The goal of this work is to invite anyone who shares our vision and values to join us. This extends from welcoming newcomers to supporting more established editors; it can range from a small donation to an individual to a large, multi-year grant enabling a chapter to grow its local activities; and it can support partnerships with hundreds of educational and cultural institutions around the world.
It means meeting people where they are, and not expecting them to find us. I think about recent grants that have supported co-creating open knowledge projects with the Atikamekw First Nation in Canada; addressing gender gaps with US-based Art+Feminism; edit-a-thons in Japan; the development of Kyrgyz Wikipedia in Central Asia; building the base of Wikimedia contributors in Nigeria; and helping teachers use Wikipedia in the classroom in Morocco.
The people who do all this can’t be seen on your computer screen, but they power the human world of Wikipedia, one that makes everything else I’ve talked about here possible.
I hope this explanation helps you to better understand what the Wikimedia Foundation does, especially when we ask you to donate.
By design, we don’t only ask a privileged few to write us big checks. Wikipedia belongs to everyone, which is why we ask people to contribute what they can if they’ve found it useful. This funding, given by only 2% of readers, helps keep the site ad-free and independent.
As you’ve read, the Wikimedia Foundation has grown to meet technical, geographic, and social changes that are only accelerating. Alongside today’s investments, we are also planning for the future – by doing things like growing an endowment to accelerate technical innovation and making big bets to reimagine the role of language on the internet. If you agree that this work is important, please consider supporting the Wikimedia Foundation.
Wikipedia is an encyclopedia, representing the best of human knowledge. It is not a social media platform or an opinion page. Nothing quite like it exists anywhere. And it belongs to all of us.
Maryana Iskander is Chief Executive Officer of the Wikimedia Foundation.
If you’d like to support our work, you can make a donation at donate.wikimedia.org.