An important new data set about online governance

I spent most of 2024 setting up the Applied Social Media Lab at Harvard’s Berkman Klein Center for Internet and Society. One of my favorite projects from the lab went public on March 4: the Transparency Hub, a huge collection of online policy documents from over 300 social media platforms. Early research results reveal that the expanding use of mandatory arbitration clauses and class-action bans is effectively setting up a parallel justice system.

Policy documents are the legal codes of the Internet. We are forced to click to accept lengthy terms of service (which nobody reads) in order to start using each platform. After that, we lose all agency. Platforms reserve the right to change the policies we must obey at any time without our consent, and often without even notifying us.

The Transparency Hub brings this new legal code to light. The ASML team has compiled over 20,000 policy documents from more than 300 social media platforms. The collection spans ten years of terms of service, privacy policies, community guidelines, transparency reports, and AI and data governance disclosures. The ASML team plans to continue updating the database as platforms update their policies.

The Transparency Hub website is public and freely available now. As well as letting you search, view, and download documents from the archive in multiple formats, the site lets you compare a company’s policy documents over time, to discover what has changed. Full database access is also available for qualified researchers.
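If you download two snapshots of the same policy yourself, the comparison the site offers can be approximated with a few lines of standard-library Python. This is only a sketch: the filenames and the example sentences are hypothetical, and the Hub’s actual export formats may differ.

```python
# Sketch: diff two downloaded versions of a policy document to
# surface what changed between snapshots. The filenames and the
# sample clauses below are hypothetical illustrations.
import difflib

old_terms = [
    "We may update these Terms at any time.",
    "Disputes will be resolved in the courts of California.",
]
new_terms = [
    "We may update these Terms at any time.",
    "Disputes must be resolved through binding individual arbitration.",
]

# unified_diff marks removed lines with "-" and added lines with "+"
diff = difflib.unified_diff(
    old_terms, new_terms,
    fromfile="terms_2021.txt",  # hypothetical snapshot names
    tofile="terms_2023.txt",
    lineterm="",
)
for line in diff:
    print(line)
```

In practice you would read the two versions from files (splitting on lines) rather than hard-coding them, but the diffing step is the same.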

You can learn more about the Hub by watching the recording of its launch event on YouTube.

Platforms are forcing us to surrender our rights

A team of undergraduates from Boston University’s Spark! program dug into the data and discovered some unsettling things. They described their research live at the Transparency Hub launch event, and have shared visualizations of their findings on the Hub website. The team also shared a more detailed report with me privately, which I draw on below.

Their most concerning finding is that platforms are subverting the rule of law in order to impose their own dispute resolution rules. Mandatory arbitration clauses and class action bans have become widespread, and are now part of over 626 policies. As the BU Spark! team writes, “This is not a niche legal trend: it is now standard industry practice.”

A quick refresher on these terms:

  • Mandatory arbitration means you cannot take the company to court. Instead you must settle any disputes with a private arbitrator who will consider both sides of the situation and issue a ruling. Arbitration findings are usually confidential and final (no appeals).

  • A class action ban means you must handle your dispute with the company as an individual, and cannot combine with others in the same situation.

Here are a few highlights of how different platforms have rolled this out.

  • These clauses vary by region: TikTok and WhatsApp specifically say their clauses apply only to US or Canadian users, and Pinterest added an EEA/UK carve-out in 2023.

  • Meta has made it harder to discover their arbitration clause. Around 2021, they moved WhatsApp’s clause from an all-caps warning into an innocuous-sounding subsection. Instagram moved it entirely out of their specific terms and into the broader Meta terms of service.

  • TikTok and Snapchat have become more restrictive over time. TikTok added both clauses in 2022; Snapchat has required arbitration since 2018 and added a class action ban in 2023.

  • YouTube gets top marks for not including any mandatory arbitration or class action ban. [Note that other parts of Google, such as devices, do.]

Platforms should not have more power than people

These clauses explicitly subvert my Magna Carta principle: Online platforms should be subject to the laws of the communities they serve, not the will of their owners.

It should be self-evident that class action bans are a blatantly pro-company policy. Platforms that impose them are effectively banning you from fighting for your rights as part of a community. This enshrines the vast power imbalance a huge global corporation has against any one person. If a country tried to prevent activist groups from using the courts, we’d condemn it as undemocratic at best, repressive at worst.

Arbitration, on the other hand, is sometimes seen as beneficial for individuals due to its speed. Civil court cases can take months, or even a year or more, to resolve. Arbitration moves much faster, so for something time-sensitive like an account deactivation, arbitration might be worth it.

This view misses other ways that arbitration favors companies over civil courts, as this article points out. Since arbitrators are paid by the parties in the case, they have an incentive to find for the company in order to preserve that repeat business. If a company keeps losing with one arbitrator, it will probably look for another who might be more favorable. Arbitration is also secret by default, so unlike with public courts, there is no way to learn what cases are being brought against the company, how it is defending itself, or what the results have been.

“Accept or leave” is not a fair choice

Platforms will retort, of course, that no one is being forced into this. We are choosing to accept these terms by signing up for the service. If we don’t like the terms, we don’t have to use the platform.

This logic might make sense for an interchangeable product with viable substitutes, like a power tool. But social media platforms are gateways to unique communities of people creating information, sharing ideas, supporting each other, and making connections. The company didn’t build our communities in a factory; it is just hosting them. Restrictive terms of service hold our communities hostage until we submit to an alternative justice system.

I am not trying to ban arbitration. I acknowledge that it can be a useful form of dispute resolution. If we were only talking about fact-based issues with basic service provisioning, like Meta accidentally deleting your data due to a bug, arbitration might make sense.

I cannot accept mandatory arbitration applied to the welfare of a community. Social media disputes over policy enforcement, malicious ads, addictive patterns, harms to children, and so forth are complex questions of governance and ethics that do not have clear answers yet. These debates and disputes need to be public and include the community. A business should not be able to hire its own judges to handle these cases behind closed doors.

[Caveats - calling for public proceedings does not mean forcibly exposing sensitive cases. As I wrote before, there are procedures for handling this in the justice system. Also, it is possible that a given online community could still choose to use arbitration for dispute resolution, but that should be something the community consents to, not imposed by the platform vendor.]

Transparency is good

I could continue this rant for a while, but I want to bring focus back to the Transparency Hub. The documents and data this team has compiled help us have an informed debate over vital issues like forced arbitration and class action bans. I hope that the ASML Transparency Hub will be around for a long time, and that some of you reading this will learn more from their dataset and share it with the world.

Ideas? Feedback? Criticism? I want to hear it, because I am sure that I am going to get a lot of things wrong along the way. I will share what I learn with the community as we go. Reach out any time at [email protected].
