Ahead of yet another US political grilling, Facebook CEO Mark Zuckerberg is proposing that Section 230 protections should be restricted to those sites with comment screening systems.
Zuck submitted his testimony to the subcommittees on consumer protection & commerce and communications & technology in advance. A lot of it addresses the slippery topic of ‘misinformation’, which has been co-opted by every aspiring political censor. As ever with censorship, the matter comes down to who decides. One person’s misinformation is another’s analysis, pushback or satire.
The more substantial stuff concerns the reform of Section 230, which protects internet platforms from legal liability for what third parties post on them. It’s clear that large parts of the internet, such as all social media, forums and comment sections, couldn’t function without this protection, but it distinguishes platforms from publishers on the understanding that the former don’t act in an editorial capacity.
As social media platforms such as Facebook increasingly act to restrict what their users can say, it’s reasonable to say they’re moving in an editorial direction. Presuming this trend continues, at what point do these platforms become publishers and thus lose their Section 230 protections? That is the key question US lawmakers need to address, especially since they’re often the loudest voices calling for a more interventionist approach by those same platforms.
“I believe that Section 230 would benefit from thoughtful changes to make it work better for people, but identifying a way forward is challenging given the chorus of people arguing—sometimes for contradictory reasons—that the law is doing more harm than good,” said Zuckerberg.
“We believe Congress should consider making platforms’ intermediary liability protection for certain types of unlawful content conditional on companies’ ability to meet best practices to combat the spread of this content. Instead of being granted immunity, platforms should be required to demonstrate that they have systems in place for identifying unlawful content and removing it.
“Platforms should not be held liable if a particular piece of content evades its detection—that would be impractical for platforms with billions of posts per day—but they should be required to have adequate systems in place to address unlawful content. Definitions of an adequate system could be proportionate to platform size and set by a third-party.”
Many commentators have interpreted this as a typical call by a massive company for additional layers of bureaucracy, so expensive that only the largest, most profitable outfits can afford to implement them. This is a classic technique used by incumbent giants to raise the barriers to entry to their market, thus restricting competition, and hopefully lawmakers will see it as such.
A monopolist’s first preference is always “don’t regulate me.” But coming in at a close second is “regulate me in ways that only I can comply with, so that no one is allowed to compete with me.”
— Cory Doctorow (@doctorow) March 24, 2021
Zuckerberg seems to have anticipated this objection with the sentence: “Definitions of an adequate system could be proportionate to platform size and set by a third-party.” But who would this third party be? There would appear to be no amount of administrative cost that Facebook couldn’t take in its stride, but even the slightest miscalculation over what should be expected of smaller sites could drive them out of business. That’s a lot of power for this third party to have.
“In addition to concerns about unlawful content, Congress should act to bring more transparency, accountability, and oversight to the processes by which companies make and enforce their rules about content that is harmful but legal,” continued Zuckerberg. “While this approach would not provide a clear answer to where to draw the line on difficult questions of harmful content, it would improve trust in and accountability of the systems and address concerns about the opacity of process and decision-making within companies.”
Why is he asking Congress to mandate this? Why isn’t Facebook already ‘more transparent’ about its censorship decisions? Again, it’s easy to interpret this as a call for additional compliance bureaucracy that would disadvantage smaller competitors.
We have long argued that there is one simple solution to all this: make Section 230 protection conditional on platforms not censoring beyond the scope of the law. Yes, this would complement Zuck’s first call for mechanisms designed to more quickly identify illegal material, but it would also remove all possibility of editorial intervention on the part of platforms and pull the rug out from under those calling for social media censorship.