On Child Safety, Graham Proposes Tech Companies Earn Section 230 Protections
Tech companies should be required to follow industry-written best business practices for protecting children’s online safety to “earn” Section 230 liability protections, said Senate Judiciary Committee Chairman Lindsey Graham, R-S.C. That portion of the Communications Decency Act gives websites immunity from liability for the third-party content they host. Congress has chipped away at that immunity, including with anti-sex trafficking legislation passed in 2018 (see 1806290044).
Online platforms aren’t doing everything they can to protect children because there’s no threat of lawsuits, Graham told reporters after Tuesday’s hearing on the topic. He suggested companies get certified annually by a regulatory body, perhaps a newly created entity, to ensure they’re following best practices. Industry and government could also add to the list of practices each year, Graham said. It’s a better option than government telling companies how to run their platforms, he added.
“Earn the liability protections. They were given to make sure the industry would flourish -- mission accomplished,” Graham said. Children’s safety is a “good place to start,” he added. “I’d hate to be the Democrat or Republican that were against that one.” Asked about supporting a political neutrality requirement, he said the proposal would be a “test drive” to see what’s reasonable. Congress should “see if it works, then build out,” he said.
“I told [Graham] that I am eager and happy to work with him,” Sen. Richard Blumenthal, D-Conn., told reporters. “There should be overwhelming, bipartisan support for protecting children, given the sometimes nauseating abuse that we see on the internet. Some of these stories are absolutely repugnant, so what better place to start?”
The Internet Association shares the goal of fostering innovation and safety online, and Section 230 is foundational to the modern internet, CEO Michael Beckerman said in a statement: It enables "platforms of all sizes to moderate harmful content and host [user content]. Changes to this law could have real, negative effects on" apps and services.
Section 230 is part of the reason companies are failing to do more, said Blumenthal. He and Graham agreed the liability protections are the “elephant in the room,” with Graham vowing to use that elephant as leverage to protect safety and innovation. A standard of care is enforceable only if there’s accountability, Blumenthal said.
Silicon Valley isn’t doing enough, testified Protect Young Eyes CEO Christopher McKenna. There’s no reaction until there’s a major news event and company reputations are on the line, he said. These are “brilliant organizations” that could take care of these problems within weeks, he said.
The Family Online Safety Institute supports effective oversight of industry self-regulation, which allows “maximum innovation and the development of creative solutions,” testified CEO Stephen Balkam. He said companies need to create a culture of online responsibility with resilient children who are “good digital citizens.” The internet can’t be made 100 percent safe, he said, noting the benefits youth have from access to information and skills.
Tech companies like Facebook, Google and Snapchat should testify about how they're enforcing child protection provisions under the Children’s Online Privacy Protection Act, Graham said. Blumenthal said the FTC should testify separately on the topic. He also vowed support for legislation from Sens. Josh Hawley, R-Mo., and Ed Markey, D-Mass., that would update COPPA. Blumenthal said he’s waiting for another Republican to join as well.
Big tech has shown “callous” disregard for children’s safety, Hawley said, citing allegations that Google’s algorithms refer pedophiles to content featuring children. One of the biggest tech companies shouldn’t be privileging profit over kids' safety, he said.
Congress needs more detail on how industry approaches age rating systems for apps, said Sen. Marsha Blackburn, R-Tenn. Parents don’t have the necessary information to make informed decisions about their children’s app use, she said.
It’s unclear if Congress will ever pass major privacy legislation, said Sen. John Kennedy, R-La. Congress has talked about a privacy bill for two years without passing anything, he said, saying that's just how Washington works: “We need a microwave, not a crock pot.”
In child-trafficking cases, law enforcement needs better access to cellphone data, National District Attorneys Association incoming President Duffie Stone testified. Some carriers refuse to assist law enforcement with access to phones, he claimed: “If we’re going to defeat sex trafficking, we’ll need help with that.”
The FTC has failed to enforce COPPA, testified Angela Campbell, co-director of the Georgetown University Institute for Public Representation. Parents can’t make informed decisions about their children’s safety without adequate notice and disclosure about the products they’re using, she said. The Markey-Hawley bill is a good first step for updating children’s privacy law, but the FTC needs to be able to bring more enforcement actions, she added.
McKenna offered two solutions for helping better protect children’s online safety: Congress should create a “uniform, independent and accountable rating system” and “enact better defaults based on the age provided during device and app setup.”