Section 230 Ruling Could Boost Censorship, SCOTUS Told in Amicus Briefs
The Supreme Court can’t rule that internet platforms are liable for their recommendations without also applying that requirement to search engines, wrecking the architecture of the internet and damaging the economy in the process, said a host of amicus briefs filed last week in support of Google in content moderation case Gonzalez v. Google (docket 21-1333).
Conservative politicians and advocacy groups, tech industry associations and internet entities such as Meta, Reddit and Yelp argued that a SCOTUS ruling narrowing the scope of Section 230 of the Communications Decency Act would be catastrophic for the internet and the algorithms that enable it. “A user cannot benefit from the wealth of information available on the Internet if she cannot find what she is interested in,” said a joint filing from tech industry groups including the Computer and Communications Industry Association, the Information Technology Industry Council and the Digital Media Association.
“There is no principled way” for SCOTUS to rule that Section 230’s protections don't cover publishers for targeted recommendations of third-party content without applying the same standards to search engines, said review site Yelp. The Gonzalez petitioners argued the opposite, but a ruling that removed search engines like Bing from Section 230’s protection would render “virtually useless the primary tool by which billions of people navigate the otherwise unmanageable amount of information on the internet,” said Microsoft. Recommendations “are not a narrow category that can be easily excised from Section 230’s scope,” said the New York University Stern Center for Business and Human Rights. “They are the essence of what today’s internet platforms do.”
YouTube shouldn’t lose Section 230 protection “for arranging its site in a way that is navigable and relevant for its readers,” said a joint filing from the Cato Institute, R Street Institute and Americans for Tax Reform. “A newspaper does not waive otherwise applicable legal protections for publishing an article when it puts that article on the front page.”
If recommendations aren’t shielded from liability claims, platforms will either remove content at the first objection, or cease organizing it in a usable way to avoid liability, numerous filings said. “By relying on crude sorting mechanisms, like chronological or alphabetical order,” platforms could avoid being accused of endorsing objectionable content, Microsoft said.
Holding platforms liable for the content they host is most threatening to “marginalized, dissident, and disfavored speakers” who depend on platforms to disseminate their message, said a joint filing from the Multicultural Media, Telecom and Internet Council, the LGBT Tech Institute, the Global Project Against Hate and Extremism, and others. “Without such algorithms, users would face a senseless cacophony of irrelevant content,” said job board sites ZipRecruiter and Indeed in a joint brief.
A finding that targeted recommendations aren’t protected from liability would make Section 230 meaningless, said CCIA, ITI and other tech industry groups, plus the American Civil Liberties Union and Daphne Keller of Stanford University’s Cyber Policy Center. “If the recommendation implicit in selecting particular material to display is sufficient to negate Section 230 immunity, there would be nothing left of the statute’s protection,” said the brief from Keller and the ACLU. “Making liable every digital service operator that helps users efficiently access content from third parties” would make the statute useless, said CCIA and others.
Congress explicitly stated the purpose of Section 230 in the statute, and SCOTUS shouldn’t usurp the legislature’s role by narrowing it, said numerous briefs, including a joint filing from Section 230 authors Sen. Ron Wyden, D-Ore., and former Rep. Christopher Cox, R-Calif. “Section 230 protects targeted recommendations to the same extent that it protects other forms of content curation and presentation,” said Cox and Wyden. “Any other interpretation would subvert Section 230’s purpose of encouraging innovation.” Congress has recently considered, and continues to consider, numerous bills to amend Section 230, and the court should defer to the legislature, said former Sen. Rick Santorum, R-Pa., and the Protect the First Foundation in a joint filing. “Such rewriting is especially inappropriate when Congress is already considering whether and how to amend its own law.”
“Weakening those protections would seriously threaten the health of the internet economy,” said the U.S. Chamber of Commerce. “Petitioners would expose to liability every company that hosts third-party content -- in practice, virtually every company operating on the internet.” Such a ruling “would also make the Internet less competitive,” said a brief from a group of economists including former FTC Commissioner Joshua Wright.
Liability concerns would create barriers to entry, leaving fewer alternatives to larger websites and businesses, the filing said. Making it more difficult to moderate content would also have national security consequences, said a joint filing from a group of former national security officials. SCOTUS could “tilt the battlefield in our foreign adversaries’ favor” by forcing platforms to be cautious about down-ranking dangerous content, the filing said. The case is set for oral argument Feb. 21.