Blumenthal Preps Exploration of Section 230, SCOTUS Arguments
Section 230 should offer platforms less of a defense when they actively promote content that results in real-world harm, Senate Technology Subcommittee Chairman Richard Blumenthal, D-Conn., told reporters Thursday.
The subcommittee will host a hearing at 2 p.m. Wednesday that he said will explore concepts DOJ presented to the Supreme Court during oral argument in Gonzalez v. Google (see 2303010061). Several Democrats and Republicans told us Communications Decency Act Section 230 needs to be rolled back to open platforms up to liability.
“I believe in personal responsibility and accountability, and I think we need to revisit Section 230 because obviously it’s being used as a shield against censorship and other abuses,” said Sen. John Cornyn, R-Texas. “So I’m certainly interested in revisiting that.”
Algorithms inject platforms into the content-creation process, and it’s a platform’s decision, said Sen. Rick Scott, R-Fla.: “They act like publishers. They ought to be treated like publishers.”
Blumenthal spoke at length about social media algorithms and the harm to children using mobile apps. Republicans, including Senate Technology Subcommittee ranking member Josh Hawley, R-Mo., have focused on both child-safety issues and alleged censorship.
Holding platforms accountable for their algorithms, for instance in situations involving wrongful deaths, would require amending or repealing Section 230, said Sen. Chuck Grassley, R-Iowa: “I’m in favor of making platforms more liable.”
“I think there’s a case to be made,” said Sen. Thom Tillis, R-N.C. “Maybe we need to look at it and revise what our expectations are around Section 230.” There needs to be a balance between preserving the intent of Section 230 and promoting innovation, he said: Targeted carve-outs are the aim. “If you hit a whole swath, you’ll get a lot of unintended consequences,” he said.
“I think there should be liability in certain situations, absolutely,” said Senate Consumer Protection Subcommittee Chairman John Hickenlooper, D-Colo. Blanket immunity allows abuse, he said: Platforms have maximized profit with “clear knowledge that young women were taking their lives. That’s unacceptable, and the only way you can make sure corporations don’t behave that way is make them liable.”
The Pact Act, from Sens. Brian Schatz, D-Hawaii, and John Thune, R-S.D., offers a good solution on Section 230, said Sen. Ben Ray Lujan, D-N.M., a co-sponsor of the bill (see 2302160066). “That should be the basis for how we work together. I support their legislation.” Separate from the bill, there should be a close look at what’s transpiring, and Congress should be “open to take necessary action,” said Lujan.
Senate Intelligence Committee Chairman Mark Warner, D-Va., spoke in favor of his approach under the Safe Tech Act (see 2302280060). The thrust of the legislation is that if it’s illegal in the real world, Section 230 shouldn’t be a uniform defense against liability for online activity, he said: Section 230 would no longer be a “universal, get-out-of-jail-free” card.
“We need to take a very deep dive,” said Sen. Joe Manchin, D-W.Va. “I’m all in favor of revisiting 230 and making adjustments as needed.” Sen. Elizabeth Warren, D-Mass., said she will have more to say about Section 230 once she’s completed her work introducing legislation for a new tech regulator with Sen. Lindsey Graham, R-S.C. (see 2209160053). Section 230 hasn’t figured heavily in public discussions about the pending bill from Warren and Graham (see 2209120059). Asked about a timeline for introduction, she said, “Soon, very soon.”