Gretchen A. Peck | for Editor & Publisher
In the simplest terms, Section 230 — or, more formally, Section 230 of the Communications Act of 1934, enacted as part of the Communications Decency Act in 1996 — distinguishes platforms from publishers. Publishers can be held liable for content they produce and disseminate, but the large tech platforms have successfully argued that they should be exempt from such legal risk because they’re merely pipelines for information and shouldn’t be held liable for the content their users create and share.
Colloquially referred to as “the law that made the internet possible,” Section 230 unquestionably allowed the platforms to grow, flourish and reach the epic proportions they enjoy today. Meta, for example, earned $39.37 billion in net income in 2021 alone, according to Statista.
Section 230 has more recently been scrutinized, especially by members of Congress and a certain former de-platformed president.
It seems that, regardless of political affiliation, everyone wants to sue the platforms, or at least have the option to sue them. Some critics want the platforms to be held liable for subjective censorship and deplatforming. They contend the rules that Twitter, Facebook and others adopt are subjective, politically biased, and amount to censorship in the digital “public square.”
Other folks want to be able to sue the platforms for the content they do allow on their apps and sites — content deemed or proven harmful, like misinformation about the global COVID-19 pandemic and vaccines, or disinformation campaigns intended to undermine U.S. elections.
You can imagine the lawsuits that would rack up if Section 230 protections were repealed. It would force the platforms to moderate content, to become fact-checkers — publishers, in earnest — and to censor their users more often rather than less. A full repeal would seem antithetical to what some members of Congress say they’re advocating for. A year ago, Congressional Republicans on the House Energy and Commerce Committee said they had a plan for retooling Section 230. They wanted the platforms to make a more concerted effort at detecting and censoring criminal activities, like selling drugs or exploiting children, but they also sought to prevent the platforms from censoring “political speech” — itself a broadly and subjectively defined term.
By July 2021, House Republicans Cathy McMorris Rodgers and Jim Jordan had a draft bill in hand that would require the tech companies to report quarterly to the Federal Trade Commission on their content rules and how those rules are being enforced.
Former President Donald J. Trump was a vociferous proponent of doing away with Section 230, but that was before he got into the social media business himself, with the launch of his TRUTH Social platform — with an interface that happens to look a lot like Twitter. In an op-ed published by The Week in February 2022, authors Nicole Saad Bembridge and Trevor Burrus wrote about the rock-and-a-hard-place position in which TRUTH Social found itself almost immediately. Users are promised a “family friendly” experience, requiring moderation, removal of “hate speech, spam, pornography and bullying,” and even — gulp — the banishing of users who don’t abide by the rules, just like the platforms he criticized for de-platforming him.
Though the former president is now occupied with flirting with a future run and with his new business venture, for which he put former Congressman Devin Nunes in charge, lawmakers left in D.C. carry on the quest to cancel Section 230.
To get a sense of how laser-focused D.C. lawmakers are, consider these statistics, courtesy of Quinta Jurecic, a Brookings Institution fellow in governance studies, who penned a feature for Lawfareblog.com on March 15, 2022: “In the 116th Congress, lawmakers formally introduced more than 25 bills to amend or repeal the statute. The 117th Congress has already seen almost 30 such proposals. Arguments over technology policy and internet regulation pull in a number of different directions, but almost everyone seems to agree that something should be done about [Section] 230 — even if nobody can agree what that something is,” she wrote.
And that, of course, is the most important question: How can it be amended so that the platforms can still exist but also be held more accountable?
As the president and CEO of the News Media Alliance (NMA), David Chavern advocates for the interests of the group’s more than 2,000 member organizations. He’s been a proponent for retooling Section 230 and penned an op-ed on the topic for Wired magazine in 2020 titled, “Section 230 Is a Government License to Build Rage Machines.” In it, he argued, “It’s time for strong amendments.”
E&P recently followed up with Chavern to learn more about how news publishers stand to gain if Section 230 gets an intelligent Congressional overhaul.
“At a very macro level, you have what is a pretty extraordinary exemption from liability that started out as a pretty simple idea, which is, the platforms are not responsible for what their users post on their services,” he explained.
“But what that really means is, even if you do bad things [as a platform], you’re not liable,” he added.
Chavern acknowledged that when Section 230 was conceived, it was a different time, when the internet was still fledgling. But the way platforms now operate is no longer like a public utility, or a “passive, dumb series of pipes,” he said. Now, they are “extremely active pickers and choosers of content.” He’s referring to the notorious proprietary algorithms that the platforms designed to deliver customized experiences to users and big-but-targeted audiences for advertisers.
“The reason why news publishers care is because our content ends up being in competitive environments on these platforms — Google and Facebook, in particular,” he said. “The incentives for the platforms are really warped by the fact that there is no liability. Out of the billions of pieces of content out there, Facebook gets to decide which content you’re exposed to on the platform. They do very editorial things.
“Their incentives are to pick the things that keep your attention, not the things that have quality. The trouble for news publishers is, we actually invest a lot in quality, expensive content. It’s actually a lot cheaper to just make things up. … But the bottom line is, if Section 230 allows platforms to not care about information quality, we’re in a bad place, because our primary value is quality,” Chavern stressed. “Now, we end up competing for people’s attention on these platforms against straight-up made-up garbage.”
Chavern suggested a good starting point for amendment: Legislate away the proprietary algorithms, forcing more public transparency about how the platforms operate and allowing users to directly choose the content they experience online.
“All of the sudden, they would care more about quality,” he said, though he acknowledged that it would impact the platforms’ relationship with advertisers. “Facebook chooses, individually, for people what they will see, and they’re rewarded for this, because that is the key to their advertising business.”
If content-curation power were handed over to users? “You could see a world in which there would now be a premium placed on quality content from news publishers,” Chavern said.
At least one Supreme Court justice seems ready to take up cases challenging Section 230. In early March, Justice Clarence Thomas wrote an opinion about the denial of certiorari in Doe v. Facebook, Inc. He affirmed the denial, but concluded, “We should, however, address the proper scope of immunity under §230 in an appropriate case.”
Gretchen A. Peck is a contributing editor to Editor & Publisher. She’s reported for E&P since 2010 and welcomes comments at firstname.lastname@example.org.