Wikimedia panelists tackle the future of intermediary liability

At Wikimedia’s 11 July legal fellow panel, scholars and practitioners discussed the rules for when and how online platforms are held legally responsible for what their users post.

When should companies that run websites be required to delete what you post online? At Wikimedia’s 11 July summer legal fellow panel, “The Future of Intermediary Liability,” legal scholars and practitioners discussed the rules for when and how online platforms are held legally responsible for what their users contribute.

Under Section 230 of the 1996 Communications Decency Act (CDA), websites are protected from liability for content posted by third parties, as long as that content does not violate federal intellectual property or criminal law. If a contributor posts hate speech, for example, the online platform hosting the content is not liable under U.S. law.

The Wikimedia Foundation considers CDA 230 essential to the internet’s power for global exchange and free speech. The law empowers anyone to contribute freely and engage with the sum of all knowledge. Without CDA 230, many online platforms would no longer exist, and those that remained would not be able to maintain flexible models for creating content, such as Wikipedia’s community editing model. Only those with the funds to fend off expensive litigation and provide rigorous content moderation would survive.

One recent carve-out from intermediary liability protections has raised concerns. The Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA, often referred to together with its Senate counterpart as SESTA) weakens CDA 230’s immunity with respect to certain content involving sexual exploitation. Because the law is vaguely written, it incentivizes tech companies to avoid risk by shutting down certain platforms or pages.

Wikimedia’s legal fellow panel featured Daphne Keller, Director of Intermediary Liability at Stanford’s Center for Internet and Society; Paul Sieminski, General Counsel of Automattic; Peter Menell, Berkeley Law Professor and Director of the Berkeley Center for Law and Technology; and Leighanna Mixter, Wikimedia Legal Counsel.

The panelists noted that lawmakers are justifiably frustrated by ugly content online, but that much of that content is protected by the First Amendment in the United States and by freedom of expression guarantees in government charters around the world. Changing intermediary liability rules is not the best way to address such protected content. “A lot of the content that is a genuine social problem is not something that the law can correct for,” Keller said at Wikimedia’s event.

Government pressures to curb freedom of speech online typically push platforms to shut down more speech than is actually required, Sieminski said. Requiring platforms to use automated content filters and to ensure that particular content is removed within a certain time frame is especially troubling, he noted, given how many “gray areas” platforms encounter on a daily basis. He said that in practice, these measures lead platforms to remove content that is actually legal for users to post.

Sieminski said that one effective strategy for communicating the difficulty of these decisions to policymakers is to show them the last ten content moderation choices his company had to make, demonstrating the importance of human judgment (and therefore of flexible timeframes) in evaluating context and content online.

Mixter said that the existing law “currently does strike an important balance” between freedom of speech and platform liability. Menell countered that although First Amendment considerations are important for online content moderation, policymakers need to consider more “unconventional” solutions to nudge profit-driven platforms in the right direction. He suggested finding ways to incentivize whistleblowers to report rogue activity and to hold platforms accountable for “deceptive communication” about how they handle user content.

Panelists predicted that future attacks on CDA 230 may center on social harms like guns, drugs, and terrorism. Mixter noted that these are fundamentally human problems now mediated through the internet. As a result, she proposed that tech companies proactively engage with policymakers and relevant nonprofits to address “the root of the problem” instead of further reducing online intermediary protections.

In discussing other possibilities for future regulation, Keller invoked the Manila Principles on Intermediary Liability, a set of principles endorsed by global civil society organizations, to explain that platforms should provide more transparency on how they take down content. She also suggested that lawmakers distinguish between small and large platforms given their differing content moderation resources.

Finally, Mixter noted that a large portion of Wikimedia’s content takedown requests is handled by its user community, facilitating transparency and accountability.

“Trying to let the community lead on these issues is critical,” she said.

Anna Windemuth, Rachel Brown, Yuan Tian, Imogen Sealy, Legal Fellows
Wikimedia Foundation

Thank you to our panelists for the lively discussion, to the Wikimedia legal team for their support and guidance, and to Cloudflare for sharing their event space with us.

Want to become a legal fellow?

Wikimedia’s legal fellowship program has helped shape an international community of law students and recent graduates interested in intellectual property, privacy, and other cyberlaw issues. The Foundation recruits legal fellows for the spring, summer, and fall. All law students and recent graduates are welcome to apply when positions open.

Related

Yale Law School and the Wikimedia Foundation create new research initiative to help preserve and protect the free exchange of information online

The Wikimedia Foundation and the Information Society Project (ISP) at Yale Law School recently expanded their longstanding collaboration to focus on raising awareness and conducting research related to threats against intermediary liability protections. Those protections are necessary if online platforms are to remain neutral third parties hosting user-generated content.

Photo credits

Intermediary liability panel: Wikimedia Foundation
